632: Liquid Neural Networks — with Adrian Kosowski

Captions
This is Five-Minute Friday on liquid neural networks.

[Music]

Like the preceding two weeks, for Five-Minute Friday today I'm having a short, five-minute-ish conversation with a preeminent data science speaker that I met in person at ODSC West in San Francisco. Our guest today is Dr. Adrian Kosowski, who introduces the concept of liquid neural networks.

Jon: All right, we're here at ODSC West, the Open Data Science Conference West, in San Francisco, 2022. I'm here live, filming with Adrian Kosowski. He's co-founder and chief product officer at pathway.com, which is a programming framework that handles streaming data updates. Adrian holds a PhD in computer science from Gdańsk University in Poland, which he completed at 20 years old, and then he went into a research career in Paris at the prestigious École Polytechnique and Inria, the computer science institute behind many key innovations, including the ubiquitous scikit-learn machine learning library in Python.

So, Adrian, I am fascinated by bio-inspired machine learning. It's something that I talk about as often as I can in my book, Deep Learning Illustrated, so I like to talk about and learn about connections between biology and machine learning, and that happens to be an area of expertise for you. There's a particular term that I'm fascinated by, and I know that you're fascinated by it too, and this is liquid neural networks. So, Adrian, what are liquid neural networks, and how could they make a real-world impact?

Adrian: Jon, it's a pleasure to be here. Liquid neural networks are a new concept which concerns a certain bio-inspired extension of recurrent neural networks.

Jon: Cool.

Adrian: The team at MIT behind it was first looking for inspiration at the brain of a very simple worm called C. elegans, a very common biological prototype for studying the brain.

Jon: Indeed, it's a model organism for many good reasons.

Adrian: One of them is that it has a super simple brain structure. It's simple in that it has very few neurons, only about 300, but also these neurons are very, very simple and act differently
than neurons in the human brain.

Jon: Right, so only about 300 neurons in a C. elegans, whereas the human brain has like 80 million or 90 million... sorry, billion, on the order of 90 billion neurons. So simpler in that sense, but also simpler in terms of structure?

Adrian: Simpler in terms of structure indeed. There are some nice studies that compare a single human neuron, in terms of computational capacity, to a pretty large structure in an artificial neural network: you would need something like several thousand artificial neurons to do the same work as a single human neuron does. By contrast, for C. elegans the neuron is really, really simple. Some like to say it's really hydraulic, in the sense that it pushes on the other neurons it is connected to, rather as water would push on another cell. So the structure of the neuron, the behavior of the neuron, can be described by a simple set of differential equations, which are known and easy to describe, and it's tempting to actually try to get an artificial neural network which tries to implement similar dynamics.

Jon: Cool. So this liquid neural network idea is related to the biological inspiration of this C. elegans worm and its hydraulic mechanism for conveying information between brain cells, between neurons. But it's also a bit of a pun. Explain why it's also a pun.

Adrian: So there are two senses in which it's liquid. It's liquid in the sense that I described, that it behaves a bit like liquid pushing, but it is also liquid in the sense that, in the implementation of the learning process of the network, time is treated as continuous. For most engineers, the usual way of looking at time is in terms of discrete steps, where a certain transformation of weights is applied to the network. In this case we look more at a differential, time-continuous way of looking at the neural network, and apply a special type of learning process based on backpropagation. And, to emphasize,
C. elegans does not directly apply backpropagation; backpropagation is applied in the artificial simulation, which is described in particular by liquid neural networks. So when we're trying to design mechanisms like this, researchers are not exactly trying to model biology.

Jon: Yeah, and it's impossible, right? Because in biology the worm is learning over time, whereas when we apply techniques like backpropagation to learn with machine learning, we're using data that were collected over time, and then we go backwards over these data points as we perform backpropagation. So it can't work the same way as it does in C. elegans, because the C. elegans is actually learning over time, whereas we're only retrospectively looking at data points that have been recorded as we move forward through time.

Adrian: That's the thing, actually, and it's also something to realize in a much broader context: the capabilities of machine learning which has to be real time, which is forced to work in a real-time context, are somehow limited, restrained. They don't exactly cover the same models as those that we are used to. In particular, if we are working with time series data, and a lot of us are, including ourselves at Pathway, not with bio-inspired models of course, but with time series data anyway, the time series sometimes comes in a way which allows us to look back at it from the beginning in the learning process, and sometimes decisions have to be taken immediately. So those are two different settings with a different interpretation.

Jon: Cool. These liquid neural networks sound fascinating. I love that they're biologically inspired, obviously, even if we can't capture all elements of how biological systems learn. But how
could these liquid neural networks make a real difference in the world? How could this revolutionize parts of machine learning?

Adrian: I guess it's always safest to take small steps. The way this is looking now is that certain inspirations, improvements that have been achieved, are influencing neural network designs: best practices, speed-ups, seemingly little mathematical tricks which help to shave off the complexity of transformations, to allow for a smoother learning process. Of course, one of the major challenges in machine learning is related, for example, to how gradients propagate in the network, how change propagates in the network, and every little mathematical optimization which allows for a smoother realization of the process, so that gradients don't zero out, et cetera, is helpful. So there are many different contexts where we can get inspiration.

Jon: Cool. And so, we go step by step, take small steps. But are there any specific practical applications of this so far?

Adrian: Again, time series data is also a big area which might potentially show promise for general-purpose models. It is related to creating reservoirs, so-called reservoir computing, as a kind of pre-processing step for other neural networks: networks which use some type of bio-inspired neural network as a way to perform a pre-processing, a dimensionality increase of the input data.

Jon: A dimensionality increase of the data. So you could have relatively simplistic inputs, pass them through a liquid neural network that would increase the number of features that go into a downstream machine learning model. This could be the future of where these models are going?

Adrian: It's definitely not where we are specifically with liquid neural networks. There are other parallel approaches which take similar biological inspirations, which are perhaps even closer to physics.
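To make the reservoir idea mentioned here concrete: a fixed, randomly wired recurrent network can expand a low-dimensional time series into a much higher-dimensional feature stream, and only a simple downstream model is ever trained. The following is a generic echo state network style sketch in Python, not Pathway's or MIT's code; all sizes, weights, and the leak rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9):
    """Fixed random weights; they are never trained."""
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    # Rescale so recurrent dynamics neither explode nor die out.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(u, W_in, W, leak=0.3):
    """Map a time series u of shape (T, n_in) to states of shape (T, n_res)."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        # Leaky integration: the state mixes its past with the new drive.
        x = (1 - leak) * x + leak * np.tanh(W_in @ u_t + W @ x)
        states.append(x)
    return np.array(states)

W_in, W = make_reservoir(n_in=1, n_res=50)
series = np.sin(np.linspace(0, 8 * np.pi, 200)).reshape(-1, 1)
features = run_reservoir(series, W_in, W)
print(features.shape)  # (200, 50): one input feature expanded to fifty
```

A linear readout (e.g. ridge regression) trained on `features` is then the only learned component, which is what makes the reservoir attractive as a cheap pre-processing step.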
Jon: Cool. All right, well, something to keep an eye on for all of us. Fascinating to learn about it. Thank you so much, Adrian, for taking the time to fill us in on this Five-Minute Friday Super Data Science episode on liquid neural networks.

Adrian: My pleasure, Jon. Thank you.

Jon: Okay, that's it for this special guest episode of Five-Minute Friday, filmed on site at ODSC West. We'll be back with another one of these soon. Until next time, keep on rocking it out there, folks, and I'm looking forward to enjoying another round of the Super Data Science Podcast with you very soon.

[Music]
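As a footnote to the conversation: the continuous-time, differential-equation view of a neuron that Adrian describes can be sketched in a few lines. This is a simplified Euler-integration illustration of a liquid time-constant style cell, not the exact MIT formulation; the weight matrices, the base time constant, and the input-dependent decay term are all illustrative assumptions.

```python
import numpy as np

def liquid_step(x, u, W, U, b, tau, dt=0.05):
    """One Euler step of a continuous-time, liquid-style neuron layer.

    The state obeys an ODE rather than a discrete update: its decay rate
    depends on the synaptic drive, so dynamics speed up under strong input.
    """
    f = np.tanh(W @ x + U @ u + b)            # synaptic activation
    dxdt = -(1.0 / tau + np.abs(f)) * x + f   # input-dependent decay + drive
    return x + dt * dxdt

rng = np.random.default_rng(1)
n, m = 8, 2                                   # 8 neurons, 2 inputs
W = rng.normal(scale=0.3, size=(n, n))
U = rng.normal(scale=0.5, size=(n, m))
b = np.zeros(n)
tau = np.full(n, 1.0)                         # base time constants

x = np.zeros(n)
for t in np.arange(0.0, 2.0, 0.05):           # integrate a periodic input
    u = np.array([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    x = liquid_step(x, u, W, U, b, tau)
print(x.shape)  # (8,)
```

Shrinking `dt` refines the approximation of the underlying ODE, which is the sense in which time is treated as continuous rather than as discrete steps.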
Info
Channel: Super Data Science Podcast with Jon Krohn
Views: 2,539
Keywords: SuperDataScience, Podcast, Super Data Science Podcast, Data Science, Jon Krohn, liquid neural networks, adrian kosowski, bio inspired learning in machine learning, machine learning neural networks, machine learning, fiveminute friday
Id: teMhaB3B6Lo
Length: 10min 23sec (623 seconds)
Published: Fri Dec 02 2022