Closing Keynote: Quantum Computing: Reality vs. Hype - John Preskill - 6/27/2019

Captions
Hello everybody. We're approaching the end of the conference, and of course we have to look forward to the future, and so what better way to do it than to hear about quantum computing from one of the world's leading experts in the field. John Preskill is a professor at Caltech; actually, he is the Richard P. Feynman Professor of Theoretical Physics. He is also the Davis Leadership Chair and director of the Institute for Quantum Information and Matter. And many people don't know this, but he's also an accomplished stage performer and singer, and after the lecture he will entertain us with his rendition of "One Entangled Evening." Right, John?

Whatever you say, George. I'll be talking about quantum information, which by now is a rather broad topic. Among the scientific themes that it encompasses are these: improving metrology by using quantum systems, protecting our privacy using quantum communication, distributing quantum states on a global scale, probing exotic quantum many-body phenomena, and speeding up solutions to hard problems. I'll be talking about the last two topics.

The way I look at quantum information science is that we are in the early stages of the exploration of a new frontier of the physical sciences, what we could call the complexity frontier or entanglement frontier. We are now acquiring and perfecting the tools to create and precisely control very complex states of many quantum particles: highly entangled states, states so complex that we can't simulate them with our most powerful classical digital computers or predict how they'll behave very well with existing theoretical tools. And that opens new opportunities for fundamental discovery.

Our conviction that this is a promising frontier to explore rests largely on two fundamental concepts: quantum complexity, which is our reason for thinking that quantum computing will be powerful, and quantum error correction, our reason for believing that it's possible to scale up quantum computers to large systems that can solve hard problems. Both of those concepts build on the underlying idea of quantum entanglement, the characteristic correlations among the parts of a quantum system that are very different from correlations in classical systems.

You can think about it this way. Imagine a book which is 100 pages long. If it's an ordinary classical book, written in bits, you can read the pages one at a time, and every time you read another page you acquire another 1% of the information that's in the book; after you've read all 100 pages one by one, you'll know everything that's in the book. But if, on the other hand, it's a quantum book, written in quantum information, where the pages are highly entangled with one another, then when you look at a page what you see is just random gibberish, revealing very little information that distinguishes one highly entangled book from another. Even after you've read all the pages one by one, you know almost nothing about the content of the book, and that's because the information in the quantum book doesn't reside in the individual pages; it's encoded almost entirely in the correlations among the pages. So that's quantum entanglement, and it's very different from the notion of correlation we normally encounter.

Now, to build a quantum computer you don't use ordinary bits, you use qubits: two-level systems which are coherent and interacting. A qubit can be realized physically in a lot of different ways. It could, for example, be carried by the polarization state of a single photon, or the spin state of a single electron, or the internal energy eigenstate of a single atom, or it could also be carried by a more complicated system like a superconducting circuit, where the qubit is encoded in the collective motion of billions of Cooper pairs. What's really interesting about quantum entanglement is that it's extremely complex to describe classically, so that if I wanted to give a complete description of all the correlations among the qubits in a state of just a few hundred qubits, written in classical language, I would have to write down more bits than the number of atoms in the visible universe. So it will never be possible, even in principle, to write that description down.
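As a rough back-of-the-envelope illustration of that scaling (a sketch in Python, not part of the talk): a general n-qubit state has 2^n complex amplitudes, so even just storing the state vector blows up long before a few hundred qubits.

```python
# Sketch: memory needed to store a full n-qubit state vector classically.
# Assumes one complex amplitude = 16 bytes (two 64-bit floats); numbers are
# order-of-magnitude only.

ATOMS_IN_VISIBLE_UNIVERSE = 1e80   # rough standard estimate

def state_vector_bytes(n_qubits: int) -> float:
    """Bytes required to store all 2**n complex amplitudes."""
    return 16.0 * 2.0 ** n_qubits

for n in (30, 50, 100, 300):
    b = state_vector_bytes(n)
    print(f"n = {n:3d}: {b:.3e} bytes "
          f"({b / ATOMS_IN_VISIBLE_UNIVERSE:.1e} x atoms in visible universe)")

# n = 30 is ~17 GB (a laptop can hold that); n = 50 is ~18 petabytes, around
# the edge of what the largest machines could hold; n = 300 already dwarfs
# 10^80, which is the point being made here.
```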
Now, that in itself doesn't guarantee that a quantum computer will be able to do things that we care about, but we have several kinds of arguments indicating that a quantum computer would really be a powerful device. One is that we know of problems which are believed to be hard classically, for which efficient quantum algorithms have been discovered. The best-known example is the problem of finding the prime factors of large composite integers, and we think factoring is a hard problem because really smart people have been trying for decades to find better factoring algorithms, and still the best algorithms we have run in a time which is superpolynomial in the number of digits of the number to be factored. We also have arguments from theoretical computer science indicating, under reasonable assumptions, that if you do a relatively modest-scale quantum computation and then measure all the qubits, you're sampling from a probability distribution that we can't sample from efficiently using any classical means. But maybe most tellingly, we just don't know how to simulate a quantum computer using a classical one, and that's not for lack of trying: physicists and chemists have been trying for decades to find better ways of simulating complex, highly entangled quantum systems, but still the best algorithms that we have, in the hardest instances, have a run time which is exponential in the system size.

Now, we shouldn't think that quantum computers have unlimited power. We don't expect, for example, that they'd be able to efficiently find exact solutions to the worst-case instances of NP-hard optimization problems. But I think it's one of the most interesting things that's ever been said about the difference between quantum and classical: that there are problems which are classically hard and quantumly easy. So it becomes a compelling question to understand what these problems are that we can solve quantumly and can't solve classically. We've learned some things about that, and I think we still have a lot to learn. But if you're looking for such problems and you're a physicist, the natural thing to think about is the problem of simulating a complex quantum system of many particles.

Some years ago, two great physicists, Laughlin and Pines, pointed out that we really have a theory of everything that matters in everyday life: it's the theory of electrons interacting electromagnetically with atomic nuclei that underlies all of chemistry and materials and biology. We know how to write down the Schrodinger equation, we just can't solve it. They went so far as to say that no computer existing, or that will ever exist, can break this barrier, and they remarked that we've succeeded in reducing all of ordinary physical behavior to a simple correct theory of everything, only to discover that it's revealed exactly nothing about many things of great importance. And the things of great importance they had in mind are situations where quantum entanglement has an essential underlying role.
Now, actually, some years before Laughlin and Pines made this statement, Richard Feynman had articulated a rebuttal, when he said "nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical." Laughlin and Pines knew that Feynman had, 20 years earlier, proposed the idea of a quantum computer for solving quantum problems, but they dismissed the idea as impractical. But I think to a physicist like Feynman, what's really most important about quantum computing is that we expect that with a quantum computer we'd be able to efficiently simulate any process that occurs in nature. We don't think that's true of ordinary classical computers, which are unable, as far as we know, to simulate highly entangled systems of many particles. So with a quantum computer we'd be able to probe more deeply into the properties of complex molecules and exotic materials, and also study fundamental physics in new ways, for example by simulating strongly coupled quantum field theories, or the quantum behavior of a black hole, or the universe right after the Big Bang.

But in defense of Laughlin and Pines, it's been almost 40 years since Feynman made this call for the creation of a new field of quantum computing, and we're only now getting to the stage where quantum devices are capable of doing interesting things. Quantum computing is hard. It's hard because we want to construct a platform that simultaneously fulfills three criteria which are hard to reconcile: we want the qubits to interact strongly with one another, so we can process the quantum information; but we don't want the qubits to interact with the outside world, which would drive decoherence and cause a quantum computation to fail; but we do want to be able to control the quantum computer from outside and eventually read it out. It's very hard to realize a quantum system that satisfies all of these desiderata, so it's only after decades of improvements in materials and control protocols and qubit design that we've gotten as far as we have.

So where are we? We're on the brink of what's been called quantum supremacy. That means a quantum computer capable of performing some task that surpasses what the most powerful existing digital computers can do; not necessarily a task that we care about for other reasons, but nevertheless some computations are too hard classically, and we'll be able to do them, we think, with near-term devices. So I thought there should be a word for this era which is starting to dawn, so I made up a word: NISQ. It means noisy intermediate-scale quantum. Intermediate scale means we're talking about devices of sufficient size, say more than 50 qubits, that we can't by brute force simulate them with the most powerful digital computers we have. But noisy emphasizes that these systems won't be error corrected; the gates will be noisy, and those errors will put limits on the computational power of NISQ technology.

Now, for physicists NISQ is exciting. It means we'll be able to explore the properties of highly entangled matter in a regime which hasn't been experimentally accessible up until now. And there may be important applications of broader practical interest for this technology, but we're not sure about that. We shouldn't think of NISQ as something that's going to change the world all by itself; it should be regarded as a step toward the more powerful quantum technologies we expect to develop in the future. I actually am confident that quantum technology will eventually have a transformative effect on society, but that impact could still be decades away, and we're not really sure how long it's going to take.
So, what is the state of the art in quantum hardware? There are a number of quantum devices operating with 20 or more qubits. Both IBM and Google have announced that they've built devices with 50 or more qubits, but we haven't yet seen experimental results from those; the IBM and Google devices are based on superconducting technology. Another approach is to trap atomic ions using electromagnetic fields; IonQ is a start-up company that's building a trapped-ion processor with 32 qubits. And aside from these general-purpose quantum computing platforms, there are also quantum simulation devices, which don't have complete control and are not universal quantum computers, but nevertheless are performing simulations of dynamics in situations where we think it's hard to do the simulation classically: for example, a Harvard device with 51 neutral atoms and a Maryland device with 53 trapped ions. And there are a lot of other interesting platforms that are steadily advancing.

Although I've emphasized the number of qubits as an important milestone, that's not the only thing we care about. We also care about the quality of the qubits, and in particular the probability of error per gate for the two-qubit gates, the entangling gates in the device which we use to build up highly entangled states. Under the best conditions today, with trapped-ion or superconducting-circuit-based quantum computers, the probability of an error per gate is about one in a thousand, and we haven't yet seen complicated many-qubit devices that achieve error rates as good as that. So naively we would expect that we can't carry out a computation with more than a few thousand gates and read out a result with a meaningful signal-to-noise. Now, we know how to do much better than that eventually, by using quantum error correction, but that has a very substantial overhead cost, so we don't expect to be able to implement quantum error correction in a serious way for some time.

If you want to be optimistic, you can look at it this way, which was recently dubbed Neven's law, after Hartmut Neven at Google. There's some evidence that these two-qubit gate error rates have been improving exponentially in time, if you go back a decade or so, and as the gate error rates improve, the volume of the circuit that you can execute with some fixed fidelity goes up; so that volume is increasing roughly exponentially. But the cost of doing a classical simulation of that quantum circuit is exponential in the volume, so the classical cost of simulating the most powerful existing quantum computers is rising doubly exponentially, which is really fast. From that point of view we might expect quantum computers to overtake classical ones in the near future, but that doesn't tell us exactly when we'll be able to run the applications that we care about.

So how are we going to use these near-term devices which are starting to become available? There's an emerging paradigm about how they could be used, a kind of hybrid quantum-classical scheme, which could work like this: with a relatively small quantum computer we execute a fairly simple quantum computation, and then we read out all the qubits, we measure them; we feed those measurement outcomes to a classical computer, which returns with instructions to slightly modify the quantum computation; and that cycle is iterated until convergence, with the goal of finding an approximate minimum of some cost function, for the purpose of solving an optimization problem. Now, as I've said, we don't expect that quantum computers will be able to find exact solutions to the hard instances of NP-hard optimization problems, but it's possible they'll be able to find better approximate solutions, or find those approximate solutions faster. If we apply this scheme to classical optimization problems, we usually call it the quantum approximate optimization algorithm, QAOA. It can also be applied to quantum problems, like trying to find low-energy states of many-body Hamiltonians, in which case it's usually called the variational quantum eigensolver, or VQE. But the idea is essentially the same in both cases.
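To make the loop concrete, here is a minimal toy sketch of that hybrid scheme in plain Python/NumPy (my illustration, not code from the talk): a one-parameter "circuit" prepares a trial state, a simulated noisy measurement estimates the energy of a simple two-qubit Hamiltonian, and a classical optimizer adjusts the parameter. On real hardware the energy estimate would come from repeated measurements on the device rather than from a stored state vector.

```python
# Toy variational quantum eigensolver (VQE) loop, simulated with NumPy.
# Ansatz: |psi(theta)> = cos(theta)|00> + sin(theta)|11>, i.e. RY(2*theta) on
# qubit 0 followed by a CNOT. Hamiltonian: H = -Z(x)Z - g * X(x)X.
# Exact ground energy is -(1 + g), reached at theta = pi/4.
import numpy as np
from scipy.optimize import minimize

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
g = 0.5
H = -np.kron(Z, Z) - g * np.kron(X, X)

def trial_state(theta):
    """Prepare the two-qubit trial state for parameter theta."""
    return np.array([np.cos(theta), 0.0, 0.0, np.sin(theta)])

def energy(theta, shots=2000, rng=np.random.default_rng(0)):
    """Estimate <H> as a noisy 'measurement', mimicking finite sampling."""
    psi = trial_state(theta[0])
    exact = psi @ H @ psi
    return exact + rng.normal(scale=1.0 / np.sqrt(shots))  # crude shot noise

# Classical outer loop: a gradient-free optimizer adjusts theta.
result = minimize(energy, x0=[0.1], method="COBYLA")
print("estimated ground energy:", result.fun, " (exact:", -(1 + g), ")")
```

The design point this mimics is that only the measured energy estimate crosses the quantum-classical boundary on each iteration; the classical optimizer never needs the full quantum state.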
Now, should we expect these NISQ devices running this type of scheme to be able to outperform the best classical computers for solving optimization problems? We really don't know. We're going to have to try it, see how well it works, experiment, and try to improve the applications. But it's really a lot to ask, because the classical methods are sophisticated and have been honed over decades of development, and these NISQ devices are just starting to become available.

One concern is that the classical computational load of this hybrid scheme is actually quite high, because it's really a variational method: there are some classical parameters, and we're trying to search in that space of parameters to find the minimum of a cost function, and that search can be highly complex. So we'd like to find ways of reducing that classical load. There are suggested ways of doing that: maybe by running smaller instances we can learn better starting points for these variational calculations on larger instances, or in some cases symmetries might reduce the classical load, but we have to be careful that the symmetries don't make it easier to solve the problem classically, if we're looking for a quantum advantage. I think the underlying concern is that we really don't have a very persuasive theoretical argument that there will be a quantum advantage achieved with NISQ devices running this type of scheme.

Now, if you look at the history of classical computing, there are lots of examples of algorithms which turned out to be useful, but for which, at least in advance, theorists were not able to give convincing arguments that they would work well. A current example is deep learning: it's finding many applications, but we still have a very limited understanding, from a theoretical perspective, of why for some applications we can train networks efficiently. And maybe near-term quantum computing will be like that. We have some ideas of what we want to try; they're really heuristic algorithms, we don't have performance guarantees, but we'll experiment, and through experimentation hope to find better applications.

But I emphasize again that in the NISQ era the errors in gates are going to put severe limits on the circuit size that we can explore. In the long run we'll do better with quantum error correction; in the near term, having more accurate gates will help, because it will mean we can execute deeper circuits. But for a while we'll probably be looking at systems with about 100 qubits and with depth less than 100, and finding interesting applications for circuits of that size is going to take some exploration, and a vibrant dialogue between the users and the algorithm experts.
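As a rough sanity check on where those limits come from (my arithmetic, using the error rate quoted earlier in the talk): if each gate fails independently with probability about one in a thousand, the chance that a whole circuit runs without any fault decays geometrically with the number of gates.

```python
# Sketch: success probability of an unprotected circuit with G gates,
# assuming independent gate errors at rate p (p ~ 1e-3, as quoted in the talk).
p = 1e-3

for gates in (100, 1_000, 5_000, 10_000):
    p_no_fault = (1 - p) ** gates
    print(f"{gates:6d} gates: P(no fault) ~ {p_no_fault:.4f}")

# ~1,000 gates still leaves about a 37% chance of a fault-free run, which is a
# workable signal-to-noise; by ~10,000 gates the fault-free probability is
# below 1e-4, so the raw output is essentially all noise. This is the naive
# reasoning behind the "few thousand gates" budget; error mitigation and
# noise-resilient algorithm design (discussed later) can stretch it somewhat.
```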
So how should we find applications? Well, here's one perspective you might have, which Scott Aaronson has advocated: instead of thinking of a hard problem and asking how to speed it up, start from what quantum computers are good at and build an application from that. And Scott followed his own advice, and proposed that near-term quantum computers could be used to produce certified randomness. You can download a string of random numbers over the internet, from a NIST website for example, but how do you know those are really random numbers? Well, you just have to trust NIST. But with a quantum device you can actually provide a guarantee that randomness is being generated, because of the intrinsic randomness of the quantum measurement process.

One way this might work is that there's some quantum circuit of sufficient size that it's hard to simulate classically, but that you can, through a large-scale classical computation, simulate. The user can ask the server to perform this quantum computation and report the outcome of measuring all the qubits at the end of the computation, and there should be a lot of entropy in those measurement outcomes, some intrinsic randomness, which we can distill to produce a very high quality string of random numbers. But you need some guarantee that the results were not provided by some classical computer masquerading as a quantum computer. That could arise, say, because the computation is hard enough that we don't think any classical computer can do it fast enough, while a quantum computer can do it a lot faster; but we have to have some test to guarantee that there isn't a classical computer pretending to be a quantum computer, and that could be done by running statistical tests on the outcomes over many runs of the protocol. Under plausible cryptographic assumptions, you can argue that a classical computer wouldn't be able to fool the user into accepting the results unless they were truly random. So this is an application which Google, for example, intends to run on their near-term platforms, and there are actually a lot of interesting theoretical questions about whether we can do this kind of thing better.
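To give a flavor of the kind of statistical test involved (my toy illustration, using the linear cross-entropy benchmark employed in related random-circuit sampling experiments; the actual certified-randomness protocol and its cryptographic analysis are more involved): samples drawn from the true output distribution of a chaotic random circuit score near 1, while cheap classical guesses such as uniformly random bit strings score near 0.

```python
# Toy cross-entropy check: do reported samples look like they came from the
# ideal output distribution of a hard-to-simulate circuit? We stand in for the
# ideal distribution with a synthetic Porter-Thomas-like one (exponentially
# distributed probabilities), which is what chaotic random circuits produce.
import numpy as np

rng = np.random.default_rng(1)
n_qubits = 12
D = 2 ** n_qubits
ideal_probs = rng.exponential(size=D)
ideal_probs /= ideal_probs.sum()          # "ideal" output distribution p(x)

def xeb_fidelity(samples):
    """Linear cross-entropy benchmark: D * mean(p(sample)) - 1."""
    return D * ideal_probs[samples].mean() - 1.0

n_samples = 50_000
honest_samples = rng.choice(D, size=n_samples, p=ideal_probs)  # idealized device
impostor_samples = rng.integers(0, D, size=n_samples)          # uniform guessing

print("honest   XEB ~", round(xeb_fidelity(honest_samples), 3))    # close to 1
print("impostor XEB ~", round(xeb_fidelity(impostor_samples), 3))  # close to 0
```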
Another application, which is very natural, and which I'll talk more about in a minute, is simulating the dynamics of quantum systems; that's something that quantum computers do naturally.

Now, I've emphasized 50 qubits as a milestone for quantum information processing platforms, but actually we have a device now with a few thousand qubits: the D-Wave Systems machine. It operates according to a different paradigm than a general-purpose quantum computer, it's what we call a quantum annealer, but it does solve optimization problems. However, we don't currently have persuasive theoretical arguments or convincing experimental evidence that quantum annealers can achieve speedups compared to the best classical computers running the best algorithms to solve the same problems. What's starting to become available are what we call non-stochastic quantum annealers; that just means a system that has a sign problem, which makes it intrinsically harder to simulate classically, and with these non-stochastic devices there may be greater potential for demonstrating convincing speedups. But so far the theorists have not been very successful in characterizing the power of these quantum annealers, so we're just going to have to do more experiments with them to investigate whether speedups can really be achieved with that type of technology.

There's a lot of interest in quantum machine learning. It's natural to be curious about what happens when we combine machine learning with the advantages of quantum processing, and it may be that a quantum network will, for some purposes, be easier to train than a classical one, but we don't really know; we're going to have to try it and see how well it works. One reason for being cautiously optimistic about quantum machine learning is the idea that we call QRAM, quantum random access memory. What that means is that it's possible, in principle, to encode a long classical vector very succinctly in just a logarithmic number of qubits, and then make use of the classical data stored in that QRAM to speed up certain applications. But most of the proposed quantum machine learning applications run into input-output bottlenecks. In particular, if I want to load classical data into QRAM, that's actually rather expensive, and it could nullify the potential quantum advantage of working with the data once it's stored in QRAM. And what comes out of the quantum network is a quantum state; to get classical information from it we have to measure it, and we can only get a limited amount of information per shot.

So it might be most natural to think of quantum machine learning as a situation in which both the input and the output are quantum, and we're making use of the quantum network to solve quantum problems, for example to characterize or control quantum systems. One example: there's interest in metrology in using entangled networks of sensors to improve the signal-to-noise in some types of measurements, but we have limited theoretical understanding of how to do that, and you could try to optimize a sensor network by using a quantum machine learning protocol. Another example: you might have a representative sample of some quantum phase of matter, but it's hard to recognize what that quantum phase is, because there's no local order parameter that lets you distinguish one phase from another by making local measurements. But if you had a sample of that type of quantum matter, you could input it to a quantum machine learning network, which in effect performs a sequence of coarse-graining steps, with some intrinsic error correction, and eventually tries to identify which phase it belongs to.

Now, one interesting application of quantum computing, which could eventually have widespread uses, is speeding up linear algebra. There's a quantum algorithm that takes as input a very large classical matrix, which is sparse and well-conditioned and can be succinctly described classically, and also takes as input a very long vector stored in QRAM; what the algorithm does is return the result of applying the inverse of this matrix to the input, also stored in QRAM as a quantum state. With a fixed error, the algorithm runs in a time which is only logarithmic in the size of the matrix, so it's an exponential speedup compared to classical matrix inversion. But the catch is that both the input and the output are quantum states, stored in QRAM. If I wanted to learn some features of the output, I could perform a measurement on that quantum state, and then run the algorithm many times to get more detailed information about the output. But if the input vector is actually classical, then I'd have to load it into QRAM, storing it succinctly in a much smaller number of qubits, and that would be costly and might nullify any quantum advantage. It's probably more promising to think about computing that input vector ourselves on the quantum computer, rather than loading it from a classical database.
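Here is a small sketch of the input-output issue described above (my illustration; QRAM hardware doesn't exist yet, so plain amplitude encoding stands in for it): a classical vector of length N can in principle live in the amplitudes of only log2(N) qubits, but measuring that register returns one basis index per shot, with probability given by the squared amplitude, so recovering the full vector classically takes on the order of N shots anyway.

```python
# Amplitude encoding: a length-N classical vector as the state of log2(N)
# qubits, and the readout bottleneck when you try to get the numbers back out.
import numpy as np

rng = np.random.default_rng(2)
N = 2 ** 10                          # vector of length 1024 -> only 10 qubits
x = rng.normal(size=N)
amplitudes = x / np.linalg.norm(x)   # the encoded quantum state (normalized)

# "Measuring" the register samples an index i with probability |amplitude_i|^2;
# each shot reveals only one index, not the stored values themselves.
shots = 200
outcomes = rng.choice(N, size=shots, p=amplitudes ** 2)

# With few shots we only get a crude histogram of |x_i|^2, and no sign/phase info.
est = np.bincount(outcomes, minlength=N) / shots
print("qubits used:", int(np.log2(N)))
print("true |amplitude|^2 of index 0:", amplitudes[0] ** 2)
print("estimate from", shots, "shots: ", est[0])
```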
Now, we do think there are interesting instances of this matrix inversion problem. We call the algorithm HHL, after the three authors who first discussed it, and it's an example of what we call a BQP-complete problem: that means that any problem that can be solved efficiently by a quantum computer can be transformed into an instance of this matrix inversion problem. But we don't have a very good understanding of what the hard instances are for classical computers, and in applications, because we need a well-conditioned matrix, there will usually have to be preconditioning, and we have to worry about the cost of that and about how the error is affected by preconditioning. In any case, it doesn't seem likely that this matrix inversion algorithm will be feasible with NISQ devices, because it's just too expensive; it requires too much quantum processing to be done without error correction.

Now, one case where we're pretty confident that classical computers find the problem hard is simulating highly entangled quantum systems. We think that's hard because smart people have been trying for a long time to find better algorithms for simulating the quantum evolution of highly entangled states, and they haven't succeeded; and quantum computers will be able to do such simulations. In the long run that's going to be an important application, certainly very interesting to physicists, and through the applications to chemistry it could have impact on pharmaceuticals, on power production, on agriculture, and so on. But as far as we can currently see, those high-impact applications probably won't be possible in the near term with these NISQ devices.

What classical computers are really bad at is simulating quantum dynamics, predicting how a highly entangled quantum system will evolve in time. So that's an arena in which we think quantum computers will have a very big advantage, and physicists are hoping we can learn interesting things about quantum dynamics in the near term with NISQ technology. The following analogy may be instructive: back in the 60s and 70s, when it first became possible to simulate nonlinear dynamical systems with digital computers, that led to an explosion of interest in, and insight into, classical chaotic dynamics. At this point we don't understand very much at all about quantum chaos, because we can't simulate it; with quantum platforms we'll be able to do that, and physicists are hopeful we can deepen our knowledge of quantum chaos in relatively near-term experiments using NISQ devices.

Now, I should make a distinction between digital and analog quantum simulation. An analog quantum simulator means a quantum system with many qubits whose dynamics resembles that of some model system we're interested in studying, and a digital quantum simulator is a general-purpose quantum computer which in principle can simulate any physical system of interest if we program it in the appropriate way. Analog quantum simulation has actually been an active area of research for some time, for over 15 years, and digital quantum simulation is really just now getting started in a serious way, but we can use similar experimental platforms for both purposes, like ultracold atoms or trapped ions or superconducting circuits. These analog simulators are becoming increasingly sophisticated, but they are intrinsically limited by imperfect control of the system we're simulating; they're best suited for studying robust properties of a many-body system which are not so sensitive to the errors introduced in the simulation. We can anticipate that eventually digital quantum simulators will surpass the analog ones, as happened in classical computing, because the digital computers can be error corrected, but that might not happen for a long time. So we should be thinking about what types of problems we can solve with relatively noisy analog simulators, and what, in the near term, the potential advantages of doing digital simulation instead might be. In particular, on a digital computer, simulating time evolution is pretty expensive, because you build up the time evolution as a sequence of many small time steps, and that requires a pretty deep circuit. On the other hand, digital provides more flexibility in the systems that we can study and the initial states that we can prepare, and potentially we can use hybrid quantum-classical methods to offload some of the tasks to a classical computer, for example in the state preparation.
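The "sequence of many small time steps" is Trotterization. Here is a minimal NumPy sketch of the idea on a tiny two-spin example (my illustration, not anything from the talk): the error of approximating exp(-iHt) by alternating the two pieces of H shrinks as the number of steps grows, which is why accurate digital simulation needs deep circuits.

```python
# First-order Trotter decomposition for H = H1 + H2 on two spins (4x4 matrices):
# exp(-i H t) is approximated by [exp(-i H1 t/n) exp(-i H2 t/n)]^n.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

H1 = np.kron(Z, Z)                            # interaction term
H2 = 0.7 * (np.kron(X, I2) + np.kron(I2, X))  # transverse field (doesn't commute with H1)
H = H1 + H2
t = 2.0

exact = expm(-1j * H * t)
for n_steps in (1, 4, 16, 64):
    dt = t / n_steps
    step = expm(-1j * H1 * dt) @ expm(-1j * H2 * dt)
    trotter = np.linalg.matrix_power(step, n_steps)
    err = np.linalg.norm(trotter - exact, 2)
    print(f"{n_steps:3d} steps: operator-norm error ~ {err:.3e}")

# The error falls off roughly like 1/n_steps for this first-order scheme, so
# higher accuracy or longer times mean more steps, i.e. deeper circuits.
```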
The way I think we should look at it is that experience with these near-term digital quantum simulators will help to lay the foundations for the more sophisticated simulations we'll be able to do in the future, when we have fault-tolerant and scalable quantum computers; and really, maybe that's the way we should be looking at NISQ algorithms generally speaking. I should say that today's analog quantum simulators really can do some rather remarkable things, like allow us to take snapshots of fluctuating quantum order in highly correlated quantum phases of matter.

So I wanted to give a couple of examples of recent experiments, so you can get an idea of the state of the art. Here's one that studied how quantum systems that start out far from equilibrium converge to thermal equilibrium. What we think typically happens in a chaotic quantum system is that if you perturb it away from thermal equilibrium, say by introducing some locally encoded information into the system, that information will spread out very quickly and become invisible to local probes, because it gets encoded in quantum entanglement involving many particles. But there's another case, which occurs in some systems, particularly ones which are highly disordered, that we call many-body localization; these systems are much less entangled, they thermalize much more slowly, and they're probably easier to simulate with classical computers. But there are recent experiments, with a 51-atom quantum simulator, done by Lukin's group at Harvard, which discovered an in-between case, where at the same point in the phase diagram both types of quantum states are seen: some which thermalize very quickly, as we expect in chaotic systems, and others which take a very long time to converge to thermal equilibrium. And that was quite unexpected and remarkable, because these two types of states otherwise seem to be very similar. These states which take a long time to thermalize may be a signature of a new class of matter which hadn't been discussed before, exhibiting what have been called quantum many-body scars: that just means atypical, slightly entangled energy eigenstates in a non-integrable system in which most of the eigenstates are highly entangled. So I think it's really encouraging that in this, one of the first experiments that arguably studied quantum dynamics using a quantum platform in a regime which is hard to simulate classically, a new phenomenon was discovered which theorists are struggling to understand.

Here's another recent example, of what you might call a programmable analog quantum simulator, sort of in the middle ground between digital and analog; that is, it's an analog system but with digital controls. This experiment was done by the Innsbruck group led by Rainer Blatt, with an ion-trap quantum computer with 20 ions in the trap.
The ion trap has its own native Hamiltonian, an Ising-like Hamiltonian describing the interactions among the ions, but it's tunable, and rapidly tunable, so you can quickly switch the Hamiltonian from one to another, and that makes it suitable for doing variational calculations and for solving optimization problems. What they studied in particular was the low-energy states of one-dimensional quantum electrodynamics with 20 lattice sites. The idea is to do a variational calculation by starting out with some initial state, running one particular Hamiltonian for some specified time, then quickly switching to another Hamiltonian and letting that run for a certain time, then switching to another, and so on; the different times that the Hamiltonians run are variational parameters, which we can then try to optimize to find low-energy states. They did this in the twenty-site model and got results in very good agreement with exact diagonalization of the system, which is possible for twenty sites, and they were also able to build into the protocol some self-verification, to check that they were really finding energy eigenstates with good accuracy. So the important point is that decoherence was not a limitation at all on the accuracy of this protocol. It's still in the regime which is easy to simulate classically; you could try to demonstrate quantum advantage by scaling it up, but the problem is that one-dimensional physics is pretty easy to simulate classically. So if you want to demonstrate quantum advantage, you should probably either study dynamics in a one-dimensional system, or the low-energy states of higher-dimensional systems, which are much more challenging for classical computers.

So, as I've already emphasized a few times, these NISQ-era quantum devices will not be protected by quantum error correction, and noise is going to limit what we can do with them. The answer to that will be quantum error correction, and that will be essential for solving hard problems, but it carries a very high overhead cost, both in terms of the number of qubits that we need and the number of gates we have to execute. How high that cost is depends on the quality of our hardware and on the problem we're trying to solve, but here's an example: a recent analysis of the resources we would need to break a public-key cryptosystem, as it's currently used today, with a quantum computer whose gate error rate is comparable to the best that has so far been demonstrated in the lab, about 10 to the minus 3. The answer is that we would need about 20 million physical qubits to run that algorithm, because of the overhead needed for quantum error correction, even though the number of logical, protected qubits is just a few thousand.
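To see roughly where numbers of that magnitude come from, here is a rule-of-thumb estimate (my sketch, using standard surface-code scaling heuristics rather than the detailed resource analysis being cited; the target logical error rate below is my assumption): the code distance d needed to suppress logical errors grows as the gate error rate approaches the threshold, and each logical qubit then costs on the order of 2*d^2 physical qubits.

```python
# Rule-of-thumb surface-code overhead estimate (order of magnitude only).
# Heuristic logical error rate per logical qubit per code cycle:
#   p_logical ~ 0.1 * (p / p_threshold) ** ((d + 1) / 2)
# Physical qubits per logical qubit ~ 2 * d**2.
# These are common rough formulas, not the detailed analysis referred to above.

p = 1e-3          # physical two-qubit gate error rate (as quoted in the talk)
p_th = 1e-2       # approximate surface-code threshold
target = 1e-15    # assumed tolerable logical error per qubit per cycle

d = 3
while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
    d += 2        # surface-code distances are odd

phys_per_logical = 2 * d ** 2
n_logical = 4000  # "a few thousand" logical qubits
print("code distance:                    ", d)
print("physical qubits per logical qubit:", phys_per_logical)
print("data qubits for the computation:  ", n_logical * phys_per_logical)

# This gives a distance in the high twenties and one to two thousand physical
# qubits per logical qubit, so several million qubits for the data alone;
# full analyses also need large factories for distilling magic states, which
# is how they end up in the vicinity of 20 million physical qubits.
```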
So if we're going to reach that regime of scalability, we're going to have to cross a very daunting chasm, from where the technology is likely to be in the next couple of years, with hundreds of qubits, to millions of physical qubits, and that's probably going to take a while. So the people eager to use these platforms are probably going to have to be patient. But in the meantime it's really important to continue to make advances in gate accuracy, in systems engineering, and in the design of algorithms and of error correction protocols, so that we can sooner reach that era of scalable, fault-tolerant quantum computing. Eventually we will use quantum error correction to keep a quantum computation on track. We won't be able to do that in the near term, because it's too costly, but it is important to be thinking about other ways of mitigating the noise that might be feasible in the near term.

If I'm doing a computation with G gates, and a single faulty gate could cause the computation to fail, then I don't expect to be able to succeed unless the gate error rate is small compared to 1 over G. But that actually depends on the algorithm we're running and on the application; depending on the nature of the algorithm, we might be able to tolerate a significantly higher gate error rate. In some cases it would be okay to have a constant probability of error in each of the qubits measured at the end of the computation, and the number of circuit locations where a fault can cause a serious error would be relatively small; that might happen because it's a low-depth computation, or because the algorithm is designed so that errors which occur early in time tend to decay away by a later time. And there are other ways in which we could try to improve the signal-to-noise: by using resampling methods, which would have smaller statistical noise than just random samples, or by extrapolating to the zero-noise limit, which means deliberately changing the noise strength so that we can see how the results change as the noise strength gets smaller, and then trying to extrapolate to the limit of zero noise, even though we can't reach that limit experimentally.

So let me just sum up. We're entering the NISQ era. Will these NISQ devices really be able to run applications that we care about faster than classical computers? Nobody knows; we're going to have to try it and see how well it works. In particular, we'll be able to test these hybrid quantum-classical schemes for solving optimization problems. We should design these near-term algorithms with some noise resilience in mind, in order to get as much computational power out of noisy devices as we can. Quantum dynamics of highly entangled systems is very hard to simulate classically, so that seems like a promising arena for quantum advantage. Experimenting with quantum test beds can speed up progress and lead to the discovery of new applications. We can't expect NISQ to change the world by itself; the goal for near-term quantum platforms should be to pave the way for the bigger payoffs we hope to get from future devices. It's important to lower gate error rates: that will allow us to execute deeper circuits in the near term, and it will eventually lower the cost of doing scalable, error-corrected, fault-tolerant computing. It's probable that really impactful applications will have to make use of error correction, and the fault-tolerant computer may still be a ways off; we really don't know how long it's going to take to get there, but progress toward fault-tolerant devices has to continue to be a high priority for quantum technology. OK, thanks for listening. [Applause]
Info
Channel: caltech
Views: 3,231
Rating: 4.9540229 out of 5
Keywords: Caltech, science, technology, research
Id: QUGnaLh6QLI
Length: 43min 42sec (2622 seconds)
Published: Mon Sep 16 2019