Q2B 2019 | Quantum Machine Learning: Algorithms & Applications | Iordanis Kerenidis | QC Ware

Captions
So I will be discussing quantum machine learning, and maybe I'll start with a disclaimer: quantum machine learning is probably one of the most overhyped and underestimated fields right now. So we'll try to get rid of some of the hype and try to figure out ways of dealing with some of the bottlenecks, to really understand what the power of quantum machine learning is and what quantum computing can do for machine learning applications.

Here is the main question I want to answer. It's not clear that I will answer it, but at least I want to pose it: when can we reasonably expect quantum computing to provide practical performance advantages over classical approaches in machine learning? This is a very speculative and difficult question, so what I will do is rephrase it and ask: how can we accelerate this process? How can we make it take less time than we think it should take? That is what I will try to do in this talk.

Let me also start by saying that there are two different methodologies one can use for quantum machine learning, the same as for most quantum algorithms. The first has to do with NISQ heuristics; here I will be talking mostly about quantum neural networks and these sorts of parameterized circuits. The second is quantum algorithms that, as we said, are a little further down the road. These are algorithms that can provably do some things faster than classical algorithms, and some of the things they can do faster have to do with classification, clustering, or dimensionality reduction, which are what people traditionally use in classical machine learning. We will be discussing both of these topics.

Now, I am not a consultant, but I learned how to do a two-by-two, I think, so I'm very proud. Here is what we are trying to do. On one axis we have the hardware requirements of the algorithms; on the other axis we have the impact that the application will actually have. Here is where we want to be, obviously: small hardware requirements and high impact. And here is roughly where we are. We have NISQ heuristics that you can actually run on the hardware we have now, but the impact is not big right now; they cannot solve anything we cannot solve classically. The hope is that as the hardware evolves and the heuristics become better, the impact will also grow. At the same time we have the QML algorithms, where I can prove to you that they have great impact because they can solve things much faster, but they need a lot of hardware that we don't have today. So what are we going to do? We are going to push in both directions in order to reach our holy grail, and because I'm a scientist I can put a more realistic goal somewhere in the middle, and we'll try to go there.

So let's see how we can try to reach that slightly more realistic goal. The first question I want to ask is: why quantum machine learning? Why do I think it is really a good area where quantum computing can actually have an impact? Here are some of the reasons for being optimistic, cautiously optimistic, but still optimistic.
The first is that we have some very powerful quantum tools that address the computational core of machine learning, which is basically linear algebra. People who are familiar with machine learning know that at the bottom of all these algorithms, whether it's deep learning and neural networks or other types of classification or clustering, support vector machines and so on, there is a part that has to do with linear algebra: you are multiplying matrices, you are trying to solve linear systems, and so on. We do have some very powerful quantum algorithms for this type of problem. So this is one of the reasons why I think quantum machine learning is really an area where quantum can offer something: it is one of the areas where we do have quantum algorithms for the underlying problems.

The second thing that is very interesting is that, as we said this morning, you can see a quantum state as a point on a sphere, like a point on the Earth, not only at the poles but anywhere, and you can even think of a sphere not in three dimensions but in many more. The main idea is that if I give you two quantum states, these are points in some space, the same way you have data points in your data sets, and one thing quantum computers can do very fast is figure out how similar, how far or how close, these points are. This is fundamental for machine learning: when we want to classify things or cluster things, we want to check the similarity of our data points. So this is another aspect of machine learning a quantum computer can do pretty fast: figuring out the similarity or difference between data points.

Here is a third reason, and to me it's very important; I think it was also hinted at this morning. Machine learning is not an exact science, and this is good news for us, because when we say that a classical machine learning algorithm works, it means we have algorithms that are extremely resilient to noise. If you want your autonomous vehicle to recognize the signs in the street, it has to do so in a way that is extremely resilient to noise: it could be raining, the light could be good or bad, it could be day or night. So we already have a lot of noise in our data, but nevertheless the machine learning algorithms can deal with it. Why is this interesting from a quantum computing point of view? Because quantum computers will also add noise to our computation; they are not going to be exact, we are going to have noisy machines. But we would be using these noisy machines for an area and for a type of problem where there is already noise in the data and we have to find ways to deal with noise anyway. The hope is that, given we already have ways of dealing with all the noise in machine learning data, the extra noise coming from the quantum machine could also potentially be handled. For example, when we train neural networks classically today, we actually inject artificial noise into the computation in order to make our algorithms more resilient. With a quantum machine we don't have to inject noise, because our hardware is already noisy enough to do that job for us.
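The talk does not spell out a construction for the "how similar are two quantum states" point above, but a standard primitive behind this kind of similarity estimation is the swap test, which estimates the overlap |⟨x|y⟩|² from repeated measurements. Here is a minimal classical simulation sketch (plain numpy; the vectors, shot count, and seed are illustrative assumptions, not anything from the talk):

```python
import numpy as np

def swap_test_overlap(x, y, shots=10_000, seed=0):
    """Estimate |<x|y>|^2 the way a swap test would: each repetition of the
    circuit returns outcome 0 with probability (1 + |<x|y>|^2) / 2."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = x / np.linalg.norm(x)              # amplitude-encoded states are unit vectors
    y = y / np.linalg.norm(y)
    p0 = 0.5 * (1.0 + np.dot(x, y) ** 2)   # ideal probability of measuring outcome 0
    zeros = np.random.default_rng(seed).binomial(shots, p0)  # simulated shot noise
    return 2.0 * zeros / shots - 1.0       # invert p0 = (1 + overlap) / 2

# Two hypothetical data points, encoded as amplitudes of quantum states.
a = [1.0, 2.0, 3.0, 4.0]
b = [1.0, 2.0, 3.0, 5.0]
print(swap_test_overlap(a, b))             # close to 1.0: the points are very similar
```

The precision of such an estimate scales roughly like 1/sqrt(shots), i.e. on the order of 1/ε² repetitions for precision ε, which is also the kind of cost hiding behind the later point about extracting classical answers from quantum outputs.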
Here is another thing I want to be very clear about: there are many different things one could try to improve when it comes to machine learning. It is only one option to say "I want to be faster," and yes, we can try to come up with algorithms that are faster. But we can also try to come up with algorithms that are more accurate; they might take the same time as the classical algorithms. And there are two other things that are becoming increasingly important. The first is the notion of interpretability. What does this mean? For many industry sectors, for example when I talk to people in healthcare or in defense, what they say is that it's great that deep learning works, but they cannot really use it, because they cannot explain how it works, and this makes it impossible to actually deploy this type of algorithm. So we have to go back to using more interpretable kinds of models, more interpretable kinds of algorithms, and this is where quantum can also offer something, because we can make the interpretable algorithms, like support vector machines or feature selection, faster and more accurate. The last one is about energy saving. We know that classical machine learning works very well for specific tasks, but we also know that it is extremely energy consuming. So another area where we can try to improve is by saying that the quantum computer may not give you a faster or a better solution, but it can really decrease your energy consumption, and this becomes more important the more data we have and the more we use these techniques everywhere in industry.

So these were the reasons to be optimistic. There are also many reasons to be very cautious, and I want to mention a few of them. The first one is basically the mirror image of one of the reasons to be optimistic: we have some very powerful quantum tools for what we call linear algebra, but they are extremely subtle. It is not a black-box thing where you just press the button and know it will always work better than your classical algorithm. These are extremely subtle techniques where you really need to understand what problem you are talking about, what data you are running your problem on, and what specific variants of these algorithms you need to deploy in order to actually try to get the speedup at the end. So these are extremely subtle tools, and they are not always going to work.

The second is that, if we want to get the full power of quantum computing when it comes to quantum machine learning, one of the things we have used quite a bit is the ability to take all the classical data that we have and somehow load it in a quantum way, so that the quantum computer can actually do the computation much faster. So there is this issue of how to take classical data and map it into quantum states that can be manipulated by these quantum algorithms and the quantum hardware. This is not a simple problem, but I will show you a little further down the talk how we can try to do something interesting about it.

The third point I wanted to make is that when we try to solve these machine learning problems with a quantum computer, what we get out of the quantum algorithm most of the time is not a classical answer but a quantum output that somehow encodes the classical answer we want inside a quantum system.
Somehow we need to go into this quantum system and, by performing some observations or measurements, extract the classical information that we really care about from these quantum states, from these outputs of the quantum algorithms. Again, depending on what type of information we need to extract, this might be very simple or it might take a lot of time. So this is something else we need to take care of: on one hand, how do we take our inputs and put them into the quantum algorithm, and on the other hand, how does the quantum output come out as a classical answer that we can use in the rest of the workflow?

And here is the last one. As I said, machine learning works because it works in practice. It was also said this morning that maybe what we care about is not proving that something should work in theory but actually seeing whether it works in practice, and the reason people believe it works in practice is that they use it with real data, they get real outcomes, and they figure out whether it actually works well or not. Many of these quantum machine learning algorithms, most of all the quantum heuristics, are proposals of ways of trying to do things, but there is a big caveat: we don't have big enough hardware to actually test whether they work well in practice or not. It is not easy to benchmark a quantum heuristic if, on one hand, you don't have a mathematical proof that it works, which would have been fine if you could actually see whether it works in practice, but for that we also need hardware. So we are in a situation where there are many ideas out there but it's difficult to figure out which of them are going to work. We have some idea from implementing very, very tiny examples, but the more the hardware improves, the more we will understand which types of heuristics work better and which do not work so well. So we are waiting for hardware in order to be able to benchmark a lot of these algorithms.

So how do we enhance classical machine learning? I will discuss the NISQ heuristics briefly and then go more into the algorithms. I also wanted to tell you that on the third day, Thursday morning, we have a very nice session with scientific talks focused on NISQ heuristics for quantum machine learning as well. There are many different proposals for what a quantum neural network should look like. I cited a few papers, but there are even more. Here is what they look like; I don't expect you to understand many of these, I don't understand half of them either. What I wanted to convey is that, given that it's not very clear how to test what works or not, there are many different proposals that we are still in the process of trying to understand.

So let me try to describe at a high level what these quantum neural networks look like. The idea is that you have some quantum circuit. What is the circuit? It is something that takes an input, which could be an image of a cat, passes this input through this procedure, through this quantum circuit, and at the end the circuit should tell you "this is a cat" or "this is a dog." I'm caricaturing a bit, but let's say this is one of the things one would want to do in machine learning.
So how do you find this circuit that actually does this, so that when you give it a cat it says "cat" and when you give it a dog it says "dog"? What you do is parameterize the circuit with a number of parameters or variables, the phis and thetas over there. In the beginning you randomly assign values to them and check whether your circuit works or not. Obviously it's not going to work in the beginning, so what you do is try to understand how to improve these parameters so that, little by little, you arrive at a circuit that you believe will recognize the cats from the dogs. This is what the training of a quantum neural network is: figuring out exactly what these parameters are in the gates you apply in the quantum circuit. In some sense this is not very different from classical neural networks, where you also have some sort of circuit, you input an image, and you need to get a label at the end, and what we call training the network is exactly figuring out what parameters to put on the edges of the neural network so that the classification happens with high probability.

As I said, this is a very interesting research area, but a lot more work is needed. The hope here is that, given that these quantum states live in a much, much bigger space, what we call an exponential Hilbert space, when you train quantum neural networks instead of classical neural networks you can take advantage of being in a much bigger space, meaning you may come up with models that were not even possible in the classical world, do these tasks with richer models, and hopefully achieve higher accuracy. They seem to work okay for extremely small instances. This is good news, in the sense that if you couldn't even make them work for very tiny inputs, the hope that they would work when scaled up would not be great. So they do work okay for small instances and simple kinds of data sets, and the hope is that they will continue to work for bigger instances on larger hardware. As I said, it is very difficult to benchmark because we are missing the hardware.

One other thing that is important for me to say about these heuristics is that we hope they are good for some specific problems, but they are not something that will solve all your possible problems. It is as if I give you a screwdriver and you try to drive a nail into the wall; that's not what it's supposed to do. These are heuristics that one can hope will work well for problems like classification; they are pretty surely not going to work well for other problems where even classical neural networks do not work well, for example. So what I am trying to say is that we have all these different ways of defining quantum neural networks, they are very interesting, and the main hope is that as the hardware becomes better their impact will grow, because on one hand we will be able to understand which of these heuristics actually work, we will be able to test them on larger instances using bigger hardware, and hopefully we can move these techniques towards something that makes sense.
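To make the "circuit with trainable knobs" picture concrete, here is a toy sketch of the training loop, under deliberately simple assumptions that are mine, not the talk's: a single-qubit "circuit" with one trainable rotation angle, trained by gradient descent on two made-up classes. Real proposals use many qubits and layers; this only illustrates the idea of tuning parameters until the labels come out right.

```python
import numpy as np

def circuit_output(theta, x):
    """Toy one-qubit 'circuit': encode the data point x as an RY(x) rotation,
    apply a trainable RY(theta), and read out P(measure |1>)."""
    return np.sin((x + theta) / 2) ** 2

def loss(theta, xs, labels):
    return np.mean((circuit_output(theta, xs) - labels) ** 2)

def gradient(theta, xs, labels):
    """Gradient of the loss: parameter-shift rule for the circuit output,
    chained with the derivative of the squared error."""
    f = circuit_output(theta, xs)
    shift = np.pi / 2
    df = (circuit_output(theta + shift, xs) - circuit_output(theta - shift, xs)) / 2
    return np.mean(2 * (f - labels) * df)

# Hypothetical toy data: class 0 encoded near angle 0, class 1 near angle pi.
xs = np.array([0.1, -0.2, np.pi - 0.1, np.pi + 0.2])
labels = np.array([0.0, 0.0, 1.0, 1.0])

theta = 0.7                               # random initial setting of the knob
for _ in range(200):
    theta -= 0.5 * gradient(theta, xs, labels)

print(round(loss(theta, xs, labels), 4))  # close to 0: the circuit separates the classes
```

On real hardware the circuit output would itself be estimated from measurement shots rather than computed exactly, which is part of why benchmarking these heuristics at scale requires bigger machines.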
So let me go now to the quantum machine learning algorithms. Here is one of the first applications that we came up with a couple of years ago; it made the news for several different reasons. Here is how we define the problem, and it is a very simple one: you have a number of users, you have some products, and you have a matrix where you see whether a user likes a product or not. What you would like to do is find a product for which you don't have information about whether the user will like it, and recommend one that, with high probability, the user will get a lot of utility from. This is, for example, what Netflix does: if you watched this movie and that movie, then you should probably like the third movie, and that is the recommendation.

I'm describing it very schematically. Assume we have a hundred million users and a hundred million products. As you know, even though we all think we are very unique, we are not; there are not so many different types of users, and, if you want, 99.5% of all users fall into some hundred categories. If you have something like that, which is basically the kind of parameters you would have for something like the Amazon preference matrix, then the classical algorithm takes on the order of a trillion steps to run. What we did a few years ago was find a quantum algorithm that would use about a thousand steps. Obviously a quantum step might be slower than a classical step, so this is not exactly what you would expect to see in practice, but theoretically this is the number of steps the quantum algorithm needs: one thousand versus one trillion. This is quite a big gap, and the hope is that part of it can actually be seen in practice once these quantum machines are running.

What happened after we proposed this result? The result had what we call an exponential gap between the classical and the quantum case. Then the question became: is this exponential gap real, or can classical algorithms actually do better as well? Then there was a new result, a so-called quantum-inspired result by Ewin Tang last year, which is an extremely interesting theoretical result showing that the gap we thought was exponential is actually only polynomial. You saw a little in the last talk that exponential is good and polynomial is bad, or let's say polynomial is worse. But when you actually apply these results to the specific type of matrices, the hundred million by hundred million one and so on, what you get for the quantum-inspired algorithm is on the order of a billion trillion trillion steps. So even though we went from an exponential gap to a polynomial gap, the actual gap between the quantum algorithm and the classical one is even bigger than it used to be. Why am I saying all this? Because for quantum and classical algorithmics alike it is extremely important to get into the details and figure out what a result really means for your application. One level of understanding is whether there is a polynomial gap or an exponential gap, and that is very interesting and important to understand, but please continue to talk to the scientists and continue to get information about your specific use cases. The only way we can hope to have real quantum applications is if we start talking about the specific use cases and how all these gaps translate in reality.
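The reason the user and product counts matter is that both the quantum algorithm and the quantum-inspired one exploit the same structural assumption: the preference matrix is effectively low rank (the "hundred categories" of users). Below is a crude numpy sketch of that classical low-rank core, with made-up toy dimensions standing in for the hundred-million-by-hundred-million matrix; it is only meant to show where the rank enters, not to reproduce either algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_products, rank = 1_000, 500, 10   # toy stand-ins; think 10^8 x 10^8, rank ~100

# A preference matrix that is low rank by construction.
prefs = rng.random((n_users, rank)) @ rng.random((rank, n_products))
observed = prefs * (rng.random(prefs.shape) < 0.1)   # only ~10% of entries are known

# Classical core: project the sparse observations onto the top-`rank` singular space.
u, s, vt = np.linalg.svd(observed, full_matrices=False)
reconstruction = u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank, :]

# Recommend, for user 0, the unobserved product with the highest reconstructed score.
unseen = observed[0] == 0
print(int(np.argmax(np.where(unseen, reconstruction[0], -np.inf))))
```

Roughly speaking, the quantum algorithm's claim is that it can sample a good recommendation in time polylogarithmic in the matrix dimensions (though polynomial in the rank), which is where the "one thousand versus one trillion steps" comparison comes from, while the quantum-inspired algorithm pays a much larger polynomial in the rank and the error parameters.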
So there are more examples. Image recognition is something we did last year. We started from the most basic and simplest data that exists out there in the machine learning world, handwritten digits: we used the MNIST data set. If you multiply everything out, the classical algorithm takes something like ten trillion steps and the quantum one about ten million. This is a smaller gap than we had for the recommendation system, but still a decent gap, which means that even with all the overhead the quantum machine needs, maybe in practice we will actually see some difference. The same goes for clustering.

One more thing I do want to go into: at the beginning of the talk we discussed quantum neural networks, which try to come up with quantum analogs of classical neural networks. There is another thing we try to do, which is to go back to the classical neural networks themselves. Why? Because we know that they work, and the question is: can we speed up the training of these classical neural networks? Here we are not trying to define new quantum kinds of neural networks; we look at the classical ones and ask whether we can speed up the training, because this is the real bottleneck in deep learning, how you train these classical neural networks. And we do have some results showing that you can get some speedups for the quantum versus the classical training of neural networks.

Now, all the algorithms I showed you so far sit at the very top right corner of that chart, meaning that so far I did not tell you anything about how easy it is to actually implement these things, how fast, in how many years, or with what kind of hardware requirements. The first answer is that if you just read the papers and stop there, they do need many qubits and many years before we will have the computers to actually run them. But here is what I want to do, and I want to spend the last few slides on this: how do we bring these algorithms closer to the hardware requirements that we have today, or that we will have in the medium term? I want to make this very clear: if we want to accelerate the moment when we really get real-world applications of quantum computing, then just saying "the algorithm needs ten million qubits, so let me wait until the hardware has ten million qubits" is not going to get us very far. What we need to do is work on both sides. The hardware should be getting better, but it should be getting better in a user-driven way; we need to tell the hardware people what we want to solve. And we need to take all these algorithms that, as of today or yesterday, were expected to use one million or ten million qubits, and figure out how to bring them closer to what the hardware can do, maybe not now, but in three years, not in twenty-five years.

Here are some of the things we are trying to do. You know, "NISQ" is so popular now that we can even use it as a verb, so we are now NISQ-ifying the QML algorithms. One aspect, as I said, is how we load classical data into the quantum machines. This is one of the assumptions we made in order to get all these amazing speedups. How do we do that? Well, there are many proposals out there, and they are not very practical; these are also some QRAM pictures I found on the internet, so you can think of them like that. One way of dealing with this is to start saying that no, this is very unrealistic, this is impossible, the hardware will never be built, so quantum machine learning is never going to work. But this is exactly what people were also saying ten or twenty years ago about quantum computing, that it was unrealistic and impossible, and yet we are here today. The way we want to think about it is: if the hardware is too difficult to build, let's go back to the drawing board and fix the algorithms. It is not only the hardware that needs to get better; the algorithms need to get better too. So we went back and asked: do we really need everything that this QRAM is supposed to be doing, which makes it so difficult to implement, or can we get away with something simpler? This is how we came up with these quantum data loaders. They are as powerful as the quantum random access memory proposals, in the sense that we can use them to do what we need for the algorithms and for solving the use cases, and they are much simpler to implement and also much more robust to noise.
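As a picture of what "loading classical data" means in the simplest textbook case, here is an amplitude-encoding sketch: a d-dimensional classical vector becomes the 2^n amplitudes of an n-qubit state, so n = log2(d) qubits suffice after padding and normalization. The function name and padding choice are illustrative assumptions; the data-loader circuits mentioned in the talk are different, more hardware-friendly constructions.

```python
import numpy as np

def amplitude_encode(x):
    """Map a classical vector to the amplitude vector of a quantum state:
    pad to the next power of two, then normalize to unit length."""
    x = np.asarray(x, dtype=float)
    dim = 1 << int(np.ceil(np.log2(len(x))))   # next power of two
    padded = np.zeros(dim)
    padded[: len(x)] = x
    state = padded / np.linalg.norm(padded)
    n_qubits = int(np.log2(dim))
    return state, n_qubits

state, n_qubits = amplitude_encode([3.0, 1.0, 4.0, 1.0, 5.0])
print(n_qubits)                # 3 qubits are enough for 5 features (padded to 8)
print(np.sum(state ** 2))      # amplitudes are normalized: sums to 1 (up to float error)
```

The appeal is the exponential compression (2^n amplitudes in n qubits); the difficulty the talk alludes to is that actually preparing such a state for arbitrary data is itself a nontrivial circuit or memory structure, which is exactly what the simpler, noise-robust data loaders are meant to address.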
The same goes for linear algebra. If you just look at the papers, quantum linear algebra is also not very easy to implement: you need large quantum computers, and most probably fault-tolerant ones. The question is: do we need the full-blown quantum linear algebra if we want to solve machine learning problems, or can we get away with simpler things? We looked at this in much more detail, and yes, if you want to train neural networks, for example, or if you want to do classification, you don't need the most elaborate quantum algorithms for linear algebra; you need specific quantum algorithms that perform matrix multiplication fast, plus a few other very specific operations, and for those there are much better circuits, with fewer qubits and much shallower depth, than one would think at the beginning. The same goes for everything else, for distance estimation, for extracting the classical information. The main point I want to make is that if we want to bridge the gap between what we can do today and what we hope to do tomorrow, we need to work on both sides. We need to improve the hardware, but we need to improve it in an intelligent way that makes sense for the applications we want to use it for, and we need to go back and have real algorithmic experts look at the algorithms and bring them closer to the NISQ era.
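To see why matrix multiplication is the specific operation worth targeting, here is a minimal classical sketch (plain numpy, made-up sizes of my own choosing) of training a single neural-network layer: both the forward pass and the gradient step are dominated by matrix products, which is the narrow primitive the talk suggests cheaper quantum circuits should aim at, rather than full-blown quantum linear algebra.

```python
import numpy as np

rng = np.random.default_rng(0)
batch, d_in, d_out = 64, 784, 128          # e.g. MNIST-sized inputs, toy layer width

X = rng.standard_normal((batch, d_in))     # a batch of input vectors
W = rng.standard_normal((d_in, d_out)) * 0.01
y = rng.standard_normal((batch, d_out))    # made-up regression targets

for _ in range(100):
    out = X @ W                            # forward pass: one matrix multiplication
    grad = X.T @ (out - y) / batch         # backward pass: another matrix multiplication
    W -= 0.01 * grad                       # gradient-descent update

print(np.mean((X @ W - y) ** 2))           # loss decreases; the cost was all in the matmuls
```

This loop is of course purely classical; it is only meant to show that a device which accelerates exactly these products, even without a general quantum linear-algebra toolbox, would sit at the computational bottleneck of training.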
So let me finish with this. When can we reasonably expect quantum computing to provide practical performance advantages over classical approaches in machine learning? That was a mouthful, and I'm not going to give you a number, but what I am optimistic enough to say is that it should be sooner than we initially thought, provided we work both on the hardware and on the software to make it actually happen. Thank you very much.

So there is time for one question before lunch. Question from the audience: "To understand your very last point, can you give us the date, the month, or the year? Or at least give us the date that you, quote, initially thought?" Oh, I see. So I am one of these, yes, older people who started doing quantum algorithms in 1999, so I have twenty years working on this, and when I started in 1999, if someone had asked me when quantum computers were going to happen, I would have just laughed and said I don't care, I'm just doing math. So at that point the answer was infinity. Right now, if you go back and look, for example, at the HHL algorithm, and see how many qubits you would need if you naively tried to implement it, you end up with some millions of qubits. So what we are trying to do is figure out how to solve specific use cases with algorithms that need far fewer qubits than that, get as much power as we can from whatever hardware is available, and also drive what architectures the hardware should be aiming for. I really believe that the hardware and the software people should be getting together. The difference in solving a problem is not a hundred qubits versus two hundred qubits; it is what these qubits can actually do, and if you know what you are trying to do before you build the machine, you can build it much better.
Info
Channel: QC Ware
Views: 2,511
Keywords: quantum computing conference, quantum computing, #q2b19, quantum computing software, Iordanis Kerenidis
Id: UZ9VukrhLg8
Length: 31min 19sec (1879 seconds)
Published: Wed Apr 15 2020