How is AI helping to fight the coronavirus COVID-19

Video Statistics and Information

Captions
This special video is dedicated to my dad, Dr. Bala Nagarajan, who passed away on this day, March 19th, two years back, and who spent his whole life researching cures for cancer at the Cancer Institute in Chennai. We miss you, Dad.

What's the role of AI in the coronavirus pandemic? This is going to be one special video outside my regular sequence, because it's important for many of us to understand the current coronavirus crisis in the big picture so we can all pitch in in different ways. As of March 18, 2020, there were 218,156 confirmed cases and 8,939 deaths from the coronavirus outbreak. The death rate of about 4% is alarming, so it's good that we have AI helping us out with its containment. We know a couple of things about the outbreak. In the U.S., even though we have the best technologies in the world, such as state-of-the-art AI, the bureaucracy of different government agencies got in the way and the work was delayed by about a month. We know that the spread is exponential, as a recent simulation under different conditions created by The Washington Post shows, so containment is critical.

On the virus ecosystem, I want to touch upon seven different areas where AI is used, but focus a little more on only one of them. One: vaccine development, including the infrastructure, clinical trials, and commercialization; AI is used to find the right vaccines faster by analyzing prior ones based on similarity measures of protein structures. Two: the infection and spread of the coronavirus; this is the area we'll focus on, to understand how data and AI help us answer some of the critical questions. Three: diagnosis and treatment at the health centers, where machines use AI; some chest X-ray scanning systems can automatically detect the virus using image recognition. Four: post-treatment, which includes post-care and insurance payments, where AI is used for faster payment processing. Five: the regulators and government agencies that collect, make available, and process data across multiple entities. Six: researchers who use that data and other data for creating better drugs, analyzing the impact of medicines, and so on. Seven: finally, the people, who form the most important part of the healthcare ecosystem; they might have access to information for self-diagnosis, use mobile apps, and interact with the ecosystem.

One mobile app, for example, helps users check if they have the virus by taking some user input and automatically getting some data about their location to rate them on a degree of risk. You can imagine a situation where, if a confirmed patient's mobile location is known at all times, it's possible to identify all the other people that this patient came in contact with.

In this video, though, I'll focus on AI applied to the infection and spread of the virus. Data for coronavirus studies is available from many government organizations (the White House, for example, is setting up a data hub called CORD-19), from hospitals, and from other sources like the ones provided by Johns Hopkins. There are two main types of data. One is textual data; CORD-19 includes over 24,000 technical articles and data about adverse effects and such. The other type is numerical data, which includes how the virus spreads and is being treated. Data is being added to these datasets each day.

Let's talk about the textual data. It's impossible for humans to read through all the literature and extract critical information, so natural language processing, or NLP, which is a branch of AI, is being applied to this vast dataset to extract useful information about the virus. We can use NLP on this literature data to understand protein structures, develop vaccines faster, understand treatment options and targets, predict adverse effects, determine dosage, and so on.

One of the latest algorithms for text processing is called BERT, open-sourced by Google. This algorithm overcomes the limitations of prior NLP algorithms by looking at words and sentences from both directions, left to right and right to left, so that it can understand a word in its full context. Sentences are mapped to vectors, or points in multi-dimensional space, and with more context a vector falls closer to other vectors that convey the same or similar meaning. For example, the word "sick" in the following sentences is mapped to two different locations because the meanings of the sentences are different: "I am sick, so take me to the hospital" is different from "I'm sick of my boss."

On a side note, I helped architect an AI platform for a company called Data Foundry, where we built a medical language processing application to answer such questions from text data using NLP. Other forms of textual data are social media data where people share and talk about the coronavirus, Google searches, Twitter feeds, and so on. The unstructured nature of textual data is what makes advanced NLP a great tool for dealing with it.
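As a rough illustration of the contextual-embedding idea above, here is a minimal sketch using the open-source Hugging Face transformers package and the publicly released bert-base-uncased model (neither is named in the video; they are just one convenient way to try this). It pulls out BERT's vector for the word "sick" in three sentences, two about illness and one about being fed up, and compares them with cosine similarity; the two illness uses should land closer together than the "sick of my boss" use.

```python
# A minimal sketch, assuming the `transformers` and `torch` packages are installed,
# of how BERT gives the same word different vectors depending on context.
import torch
from transformers import AutoTokenizer, AutoModel

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def vector_for_word(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding for `word` inside `sentence`."""
    inputs = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]          # (seq_len, 768)
    tokens = tok.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]                           # vector at the word's position

a = vector_for_word("i am sick so take me to the hospital", "sick")
b = vector_for_word("i feel sick and need to see a doctor", "sick")
c = vector_for_word("i am sick of my boss", "sick")

cos = torch.nn.functional.cosine_similarity
print("illness vs illness:", cos(a, b, dim=0).item())   # expected: higher similarity
print("illness vs fed-up :", cos(a, c, dim=0).item())   # expected: lower similarity
```

This is only a toy comparison; research systems built on CORD-19 use purpose-built biomedical language models and far more careful pooling, but the underlying idea of context-dependent vectors is the same.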
Now let's talk about the numerical or semi-numerical data. Semi-numerical means that we have a way of converting it to numbers easily; for example, gender values such as male, female, and other can be represented by numbers like 1, 2, and 3 respectively. Typically, this data is a little more structured. Datasets from Johns Hopkins, Kaggle, and so on follow a structure, and you can think of them as a simple table. Most tables are rolled up, though, and data is not reported at the individual patient level, but the more detail we have, the better we can apply ML, or machine learning, algorithms on top of them.

As an example, one of the tables has the following columns, rolled up at the county level: the name of the county, its latitude and longitude, the number of identified cases, the number of deaths, and the dates of these occurrences. A more granular dataset at the patient level (there's one available from China) includes additional information such as the patient's symptoms, other medical conditions, their age, gender, and so on. In the former case, since we have county information, we can also combine this with demographic information, and perhaps weather information, from other sources to make new inferences. DataRobot, a company in Boston, did just that and concluded, based on initial data, that the virus seems to affect more affluent people first, likely because they can afford to travel more. Now that's an interesting find.

The essence of this table is that each row becomes a point in multi-dimensional space, and all machine learning algorithms look for patterns in that space. With this data, we can do two things that machine learning is good at: one is classification and the other is prediction. With classification, we try to find clusters and answer questions about patterns. For example, what are the characteristics of people who initially got the disease? Which locations are more susceptible to the virus? What ages of people died disproportionately? With prediction, we can try to project the spread of the virus over time, so we can possibly put stronger mitigation measures in its path, or estimate which health centers will be overwhelmed so that we can be prepared ahead of time with better provisioning and supplies for them.
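To make the classification and prediction ideas concrete, here is a minimal sketch on entirely invented numbers; the county features and daily case counts below are made up for illustration and are not drawn from the Johns Hopkins or Kaggle datasets. It clusters hypothetical counties with k-means and fits a simple exponential-growth line to a short case series to project a week ahead; real forecasting models are considerably more sophisticated.

```python
# A minimal sketch, on invented data, of the two tasks described above:
# clustering locations by their characteristics and projecting spread.
import numpy as np
from sklearn.cluster import KMeans

# --- classification side: group made-up counties by density and case rate ---
# columns: population density (people/km^2), confirmed cases per 100k residents
counties = np.array([
    [12000, 310], [9800, 250], [11000, 280],   # dense urban counties
    [850, 40],    [700, 35],   [900, 55],      # suburban counties
    [60, 5],      [45, 3],     [70, 8],        # rural counties
])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(counties)
print("cluster assignments:", labels)

# --- prediction side: fit exponential growth and project a week ahead -------
cases = np.array([100, 132, 180, 241, 320, 430, 575])   # invented daily totals
days = np.arange(len(cases))

# exponential growth looks linear in log space: log(cases) ~= a*day + b
a, b = np.polyfit(days, np.log(cases), deg=1)
future = np.arange(len(cases), len(cases) + 7)
projection = np.exp(a * future + b)
print("projected totals for the next 7 days:", projection.round().astype(int))
```

The projection only holds while the growth stays exponential; as the transcript notes later, changing conditions such as social distancing mean such a model has to be refit continuously with new data.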
As with any machine learning problem, we find that some factors are more important than others in slowing the spread. For example, we've been asked to maintain social distancing, clean possibly contaminated surfaces like doorknobs with disinfectants, stop gathering in large groups, reduce traveling, and so on. Among these, which ones are more effective than others, relatively speaking? If we had historical data on a previous outbreak, we might apply a technique called principal component analysis to the data to figure out which measures actually matter the most.

As we build models for classification and prediction, we also need to consider other things. New data is coming in at a much faster rate, which means the model has to be constantly refreshed. If we find new attributes in the data, then the model has to be rebuilt, or even different types of models have to be trained with the new data. So in this coronavirus context, the model is only valid for a short duration, because external factors that affect how the disease spreads and is treated continue to change. For example, with more awareness about social distancing, the original projections and predictions of the model could be wrong. That's why companies are continuously refining and rebuilding their models every day with new data. Also, models that work in China may not work in the US because of different conditions; likewise, a high-level country model will not be useful for making local predictions.

I went a little deeper into one part of the full spectrum of possibilities in this particular video, but as an architect and an AI technologist, I wanted to merge these two disciplines together to bring you the whole picture, so that you can dive deeper into the areas you are specifically interested in and perhaps contribute to the mitigation of the virus. Thanks for watching. [Music]
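To make the principal component analysis idea above concrete, here is a minimal sketch on invented data; the mitigation-measure columns are hypothetical, and PCA on its own only reveals which measures account for most of the variation, so a supervised model against actual spread would still be needed to judge effectiveness.

```python
# A minimal sketch, on invented data, of applying PCA to mitigation measures.
# Columns are made-up per-region measurements; none come from a real outbreak.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_regions = 200

distancing   = rng.uniform(0, 1, n_regions)      # fraction complying with distancing
disinfection = rng.uniform(0, 1, n_regions)      # fraction of surfaces cleaned
group_size   = rng.uniform(1, 50, n_regions)     # average gathering size
travel       = rng.uniform(0, 1000, n_regions)   # trips per 1,000 people

X = np.column_stack([distancing, disinfection, group_size, travel])
X_scaled = StandardScaler().fit_transform(X)     # put features on one scale

pca = PCA(n_components=2).fit(X_scaled)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("component loadings:\n", pca.components_.round(2))
# Large absolute loadings on the leading components hint at which measures drive
# most of the variation across regions; relating them to actual case growth
# would require a supervised model on real historical data.
```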
Info
Channel: Raj Ramesh
Views: 18,156
Rating: 4.9178643 out of 5
Keywords: #aiandyou, coronavirus, ai, artificial intelligence, machine learning, ai in community, business architecture
Id: pOCvJKvulyU
Length: 12min 20sec (740 seconds)
Published: Thu Mar 19 2020