Signal Processing and Machine Learning Techniques for Sensor Data Analytics

Captions
Welcome to this webinar on signal processing and machine learning techniques for sensor data analytics using MATLAB. My name is Gabriele Bunkheila and I'm part of the product management team here at MathWorks. My background is in signal processing, which is also the area where I've spent most of my career here, helping engineers and scientists apply MATLAB to their own challenges.

In this webinar I'm going to discuss the application of some standard MATLAB techniques for research problems and product design workflows that require the joint use of machine learning, for example clustering or classification methods, on signals or time series, meaning sampled one-dimensional values varying over time. As we'll see, much of the complexity of these problems stems from the need for a fair bit of domain knowledge in both machine learning and signal processing, and that often poses a challenge. Whether you're unfamiliar with signal processing, machine learning, both, or neither, I hope this session will help you understand how MATLAB can greatly accelerate the algorithm design workflow for these problems. We are also seeing a growing industry trend of pushing machine learning algorithms, along with signal processing, onto embedded devices and closer to the actual signal sensors; I'll refer to this class of applications as sensor data analytics or embedded analytics. Because the development of such products poses additional engineering challenges, I thought it would be useful to allocate some time at the end of the session to review the additional support that MATLAB provides for those types of workflows.

So here's the coarse list of what we'll look at. We'll spend quite some time revisiting how common signal processing methods can be applied both to preparing or pre-processing signals and to extracting descriptive features. I will then show how to train and test a classification or pattern recognition algorithm in MATLAB, including some simple approaches to scale up performance for computationally intensive problems. Doing this will give us an opportunity to explore a lot of basic MATLAB features, from interactive apps to intuitive language constructs, that accelerate algorithm development workflows. And finally, we will discuss the MATLAB capabilities for the design of online predictive systems, including the design and simulation of DSP functionality and the generation of C source code from predictive algorithms, to have them running on embedded architectures.

Now, I'll come back to the slides later, but I'm going to spend most of this hour discussing a practical example. Right up front, what you're looking at here are the three components of an accelerometer signal captured using a smartphone. The signals are generated by a subject wearing the smartphone in a fixed position on their body and engaging in different physical activities. We are running an algorithm as if it operated on live signals, but in this case we're using labeled, recorded data for validation, so we get to know the ground truth while we're also trying to automatically understand what activity the subject is doing, purely based on signal processing and machine learning methods. And as you can see, most of the time we're successfully guessing their activity.

Now I'd like to make a simple point: this is just a simple example working with accelerometer data, but the techniques that I'll discuss are relevant to a wide spectrum of applications and to most types of sampled signals or time series. To make my point, I've collected a short list of examples that I came across personally while working with MATLAB users.
Broadly speaking, these use the same types of techniques, and they already capture use cases in a number of different industries like electronics, aerospace and defense, finance, and automotive. Again, even if this is just a random list of examples, the relevance of the techniques we're discussing here is much wider.

I'll take advantage of one last slide before going back to MATLAB to review the different pieces of the example that we've just seen. We take the three components of the sampled acceleration coming from a smartphone and we predict the physical activity of the subject as a choice between six different options or classes: walking, walking upstairs, walking downstairs, sitting, standing, and laying. The prediction is done through a classification algorithm. Classification describes a popular class of machine learning algorithms; the key idea is guessing or predicting the class of a new sample, in this case a signal buffer, based on previous knowledge of similar data. The way it works is that first the algorithm is trained with a large set of known or labeled cases, optimizing its free parameters to identify those known cases as accurately as possible. Once trained, it can be run on unknown new data to formulate a guess, or prediction, of the most likely class for that new data. In general, the training phase is a lot more data- and computationally intensive than the test or runtime phase, so for embedded applications it is not uncommon to run the training phase on a host system or computer cluster and only deploy a fixed, pre-trained algorithm onto the embedded product.

Regardless of training or runtime use, it is very uncommon for classifiers to be able to work on raw waveforms like these. In practice, for each signal segment or buffer, one first has to extract a finite set of measurements, often called features, from the raw waveforms. The bottom line for choosing what features to use is that they should capture the similarities between signals in the same class and the differences between signals in different classes. We'll spend most of the time left showing how MATLAB can be used to design these two big algorithmic steps, starting right from the initial exploratory phase.

Now, as a quick note, a key part of working through a similar task or project is the availability of a reference data set, that is, a collection of signal recordings acquired in a controlled experiment and carefully labeled, so that each signal segment is known and associated with the right activity. For this example I'm borrowing a nice data set made available by two research groups, respectively from Spain and Italy, and available at the address on this slide.

I hope the general problem is clear enough by now, so let me go back to MATLAB. In the following, I'm going to assume that you're familiar with the basics of MATLAB, including things like scripts and functions and basic plotting and visualization. To walk you through this example I'm going to use a script formed of a number of code cells that can be executed independently. I'll skip the very first cell, which I used to launch my completed application at the very beginning. Executing this first cell loads a portion of my data set and plots it; let's not worry too much right now about how I loaded the data and produced this plot.

Now let's take a look at the data that we have available. We have a vector ak containing the samples of the vertical acceleration for a subject over a period of time. The data set itself has 3-D acceleration recordings for 30 different subjects. The acceleration was sampled at 50 samples per second, and that's the meaning of the sampling frequency variable fs.
We also have the time vector t, with the same length as ak, which is good: t and ak can be plotted one against the other, and the plot shows us that for this subject we have around 8 minutes' worth of samples. Note that time here is regularly spaced; in some applications that may not be the case, or samples may be missing, but keep in mind that MATLAB has plenty of techniques to regularize and pre-process those types of signals. The other obvious thing to notice is the other long vector actid, shorthand for activity ID, which is telling us what activity each data sample corresponds to, as an integer between 1 and 6. We can interpret those integers by looking at the remaining variable actlabels: 1 is walking, 2 is walking upstairs, 3 is walking downstairs, and so on.

So the plot here looks very similar to our final objective, which is identifying the activity given a portion of signal. But remember, this is known, labeled data; we're only visualizing available information. What we want to do is design a method that can learn from this information to guess the activity on new data, without previous knowledge. So how could we do that?

Our first attempts will try to use intuitive approaches. For example, in this plot you can already see that the acceleration waveform does indeed look different depending on the activity. You can see that almost all activities have an average of around 9.8, or 1 g, while one sits more around 0, and guess what, that's laying, because the body has a different orientation with respect to the gravitational field. Then there are three of these that look fairly static — no surprise, that's sitting, standing, and laying — while the other three appear to oscillate up and down a lot more. So one could start by doing something very simple: just use some statistical measurements on a portion of consecutive samples, regardless of how they're distributed in time. For example, looking at the distributions of the walking and laying samples respectively, you can see that just computing the mean value and comparing it to a threshold, say 5 here, would give us a pretty good chance of telling the difference between the two. Similar considerations hold between, say, walking and standing, but in this case we'd probably want to measure the standard deviation and compare it to something like 1 or 2 meters per second squared. But what if we had to work out the difference between plain walking and walking upstairs? In this case the mean and the standard deviation, or the width of the distribution, look very similar. Here what you should really consider is some more advanced analysis of how values vary over time, and look at things like, say, the rate or the shape of the oscillations. An intuitive reason for that may be that people move faster when they walk downstairs, say, compared to when they walk upstairs, or that the types of movements differ for different activities. This is precisely where signal processing methods start to be part of the picture.

Before going there, let me make a point in passing on what I've just done. I just casually drew three histogram plots and quickly discussed their meaning. Even only this task could be a fairly hard one if you had to do it by hand from scratch, but these types of things are available in MATLAB as single functions. So despite using a pre-edited function of mine to glue those plots into a figure and give them the right colors, inside it a single call to the histogram function is doing all the hard work for me. histogram supersedes the older hist; it was introduced in release R2014b of MATLAB and it provides a new, more efficient way of plotting histograms.
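As a minimal sketch of the intuitive approach just described, something like the following would compute and compare those statistics directly (the names ak and actid follow this walkthrough; the activity codes and the threshold of 5 are the ones mentioned above):

    akWalk = ak(actid == 1);      % samples recorded while walking
    akLay  = ak(actid == 6);      % samples recorded while laying
    [mean(akWalk), mean(akLay)]   % means fall on opposite sides of a threshold of 5
    [std(akWalk),  std(akLay)]    % walking also varies a lot more over time
    histogram(akWalk), hold on, histogram(akLay), hold off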
Now, to analyze variations over time, we want to focus on the acceleration caused by the body movements. It's reasonable to assume that the body movements produce faster variations, while the gravity contribution is constant or almost constant. If I have two contributions that are blended together into a single signal and I want to separate them out, then a technique that often applies is digital filtering. In this case, for example, we want to keep only the oscillations quicker than about one cycle per second, say, which is a rough figure for the average number of steps per second, and discard contributions with slower oscillations. Using the right jargon, that requires designing and applying to our data an appropriate high-pass filter. I'll repeat these ideas as we go through the process.

Now, designing and applying a digital filter is hard unless you have the right tools; the design phase in particular requires quite a bit of math and a lot of domain-specific knowledge. In MATLAB there are many different ways in which one could design a digital filter. For example, one could choose to do it entirely programmatically, which means only using MATLAB commands, or through built-in apps. Let's first take a look at what the latter would look like; using an app is generally a great idea when you approach your problem for the first time. To do that, I go to the Apps tab of the MATLAB toolstrip and I scroll down to the Signal Processing and Communications app group. In here I will pick the Filter Design & Analysis Tool; for more advanced filter design you may also want to try the Filter Builder app.

The Filter Design & Analysis Tool comes with several sections; for example, this filter specification pane will help us specify the right requirements for our filter. Down here to the left is where we start to define what we're looking to achieve; in this case I'll select a high-pass filter, and you can see that a lot of other choices are also possible. Down here there's a more technical choice: if you know about digital filters, you'll probably know what FIR and IIR mean, and the design methods listed here will probably resonate quite a bit. Here I'll skip the details and just choose one of these IIR options. Moving to the right, all I need to do is capture my requirements using the specification pane above. The things I have to say include: we're using a sampling frequency of 50 Hz; and we want to keep unaltered, that is attenuate by a factor of 1 or 0 dB, all signal components oscillating more quickly than one time per second, or 1 Hz — let's be generous and say 0.8 Hz. Then everything to the left of this other value, Fstop, is attenuated by at least a given number of dB; I'll set this to, say, 0.4 Hz, and correspondingly this Astop to 60 dB. This means that all oscillations slower than 0.4 times per second will be made a thousand times smaller by the filter. Finally, by pressing Design, the tool does all the job for us and we end up with a filter that satisfies our requirements.

We have a set of analysis tools available right within this app to verify that the filter is behaving as expected. For example, now we're looking at what's called the magnitude response over frequency; if I need to confirm this is honoring the specifications, I can overlay a specification mask, or if I want to understand the transient, by the press of a button I can quickly visualize things like the step or the impulse response. Once my filter is designed, what I really want to do is use it in MATLAB and apply it to my signal.
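For reference, a comparable high-pass design can also be sketched programmatically with designfilt. This is a hedged equivalent of the interactive design above, not the exact code the app generates (the passband ripple value is an assumption, and hpf and ak follow the naming used in this walkthrough):

    fs  = 50;                                 % sampling frequency, Hz
    hpf = designfilt('highpassiir', ...
          'StopbandFrequency', 0.4, ...       % Fstop, Hz
          'PassbandFrequency', 0.8, ...       % Fpass, Hz
          'StopbandAttenuation', 60, ...      % Astop, dB
          'PassbandRipple', 1, ...            % Apass, dB (an assumed value)
          'SampleRate', fs);
    ab = filter(hpf, ak);                     % keep only the body-movement contribution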
For that I can choose between two types of approaches: I can export the filter into my MATLAB workspace as one or more variables, or I can generate some MATLAB code that realizes programmatically, through a script, all that I've just done interactively. The code that you see here has just been generated automatically, as the header of this function is telling us; however, I could have just as well decided to use similar commands independently. Having this generated automatically for me can also help me gain some insight, so that the next time around I could more quickly design a filter programmatically. But more importantly, this now gives me a quick way of recreating the filter from my own code, just by calling this function. I'm now going to discard this new function, because I have a previously saved version already available in my working folder, called hpfilter. Going back to my script, you can see that I'm creating a filter via my preset function using one line of code, and in the next line I'm applying the filter to my vertical acceleration. That creates a new signal, where we hope to find only the contributions due to the body movements. If I execute this section, I'm also plotting the newly filtered signal against the original one. In the plot we can see that the new signal is now centered around zero, as expected and desired, and some transient behavior due to the filter is present, which is perfectly normal.

Now let me focus on one activity at a time. How can I do that? There's a very effective MATLAB feature called logical indexing that I can use here. Look at this: say I want to isolate the walking portion of my signal. The activity type is stored in the vector actid, so I check here where actid is equal to 1, and because I have two walking portions here, to only look at one I also require that the time is smaller than 250. Again, the result here is a vector of the same length as my signal that I can use to select only the samples that honor those criteria. And here's our walking segment; we can zoom in and confirm that the signal oscillates fairly regularly, or almost periodically.

Now the question is: how can I measure how quickly this is oscillating, or extract some parameters that quantify the shape or fingerprint of these oscillations? A good answer would be by looking at the spectral representation of my signal, or, as some would say, by computing its FFT. Much better than the FFT, which is a pretty low-level operation, the right phrase here is power spectral density, which may use the FFT along with a few other bits and pieces. Again, my best bet is to focus on my objective and see if MATLAB can simply do that for me, as is the case. Out of the many functions available to estimate the power spectral density of a signal, here I'm using the Welch method, which is pretty popular. In one line of code, here I have my spectrum: on the x axis I have the frequency, from 0 to 1/2 of my sampling frequency, which was 50 Hz, and on the y axis I have dB per Hz, or power density. The regions where the values in this plot are higher are likely to carry the information that matters for our signals. This pattern of peaks between 0 and 10 Hz with higher energy holds a lot of measurable information. If you ever sat in a signal theory class, you'll remember about the spectra of signals that are periodic or almost periodic: we can see a fundamental frequency, roughly around 1 Hz, and a number of harmonics at positions that are multiples of that frequency. As for extracting information from this, the distance in frequency between these peaks is the rate of the time-domain oscillation, and the relative amplitudes of the peaks describe the shape of the oscillations, a bit like the timbre in musical signals.
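Put together, the two steps just described might look like the following minimal sketch (variable names follow this walkthrough, with ab being the filtered acceleration from the earlier sketch, and the 250-second cutoff being the one mentioned above):

    isWalking = (actid == 1) & (t < 250);               % logical mask, same length as the signal
    [pxx, f]  = pwelch(ab(isWalking), [], [], [], fs);  % Welch PSD estimate of the walking segment
    plot(f, 10*log10(pxx))                              % power density in dB/Hz vs frequency in Hz
    xlabel('Frequency (Hz)'), ylabel('Power/frequency (dB/Hz)')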
To validate this, let me also show you the spectrum for walking on top of the one for walking upstairs, in the range between zero and ten hertz. Here, walking upstairs produces slower and smoother movements: because they're slower, these peaks are all pushed to the left, and the smoothness in the time domain causes the peaks to the right of the fundamental to decay very quickly, indicating softer time-domain transitions. If this sounds unfamiliar, think about the spectrum of a pure sine wave, which has a single peak, compared to that of a square wave, which is full of high-frequency harmonics.

Once we've established that the spectral peaks carry information, we'd like a programmatic way of measuring their height and position. So here's the next question: how do you identify peaks in a curve? Contrary to what some may think, that's not trivial, but Signal Processing Toolbox comes to the rescue with a function called findpeaks that is built to do just that. If we use it while providing no other information than our power spectral density, it will return the complete set of local peaks found in my plot. But if we put some more effort into defining what we want — for example, how many peaks it should return, what prominence we require, or what minimum distance we expect between nearby peaks — then the results are much more encouraging, and with just a few lines of code we now have a programmatic measurement approach that can be automated and is highly descriptive of our signal characteristics.

In the example that I showed you at the beginning, I was using a couple more signal processing measurements to extract other features, but I think by now you get the general spirit of an exploratory approach for extracting features from signals. What I did at the end of this phase was to collect all the useful measurements identified into a function, so that for each signal segment available I'm able to automatically produce the collection of all my measurements, or features, that describe it. Let me show it to you quickly: for every new buffer of acceleration samples in the three directions, here I am computing the mean, filtering out the gravity contribution, computing the RMS, measuring the spectral peaks, as well as a couple of other things. If I look at the spectral features subfunction, you can recognize the functions pwelch and findpeaks from a few minutes ago. In total, this function returns 66 highly descriptive features for every new signal buffer it is passed as an input. What I really like about it is that if I measure the net number of code lines, excluding comments and empty lines, it sums up to only 54. That's 54 lines of code for 66 features, or less than a single line per feature, which I find indicative of how concise the MATLAB language is, to the advantage of both understanding and productivity.

With that, I think we can now say that we're halfway through our exploratory workflow. We've put in place a method to extract a finite set of features for every given segment of signal; we now need to design a classifier able to learn how to associate measurements — sets of 66 features in this case — to a class, that is, a choice of activity among six available options.
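As a sketch of that progression with findpeaks, one could start unconstrained and then add requirements; the specific parameter values below are illustrative assumptions, not the exact ones used in the webinar:

    psd_dB = 10*log10(pxx);
    findpeaks(psd_dB, f)                     % unconstrained: returns far too many local peaks
    [pks, locs] = findpeaks(psd_dB, f, ...
        'NPeaks', 8, ...                     % return at most 8 peaks
        'MinPeakProminence', 10, ...         % require a clear prominence, in dB
        'MinPeakDistance', 0.25);            % require spacing between peaks, in Hz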
To work with a classifier, we first need to map all our data into the new feature-based representation. Let me open this other script to show you quickly what I mean. Imagine that we reorganized our data, say 8 minutes of samples times 30 subjects, into a large number of small buffers of equal length, say 128 samples. What we do now is that for every one of those buffers we call our feature extraction function to compute our 66 features, and we end up with a new feature data set with as many columns as the number of features and as many rows as there are available buffers.

Because classification algorithms often need a lot of data to learn, extracting features from an entire data set can take a very long time, and if along this exploratory phase one decides to use different features, then the whole operation starts over. Let me show what I mean here on a small scale. Let's reduce the number of data buffers here to a mere 600 and run this. I start a timer before the loop and stop it right afterwards, and we can monitor the progress as my 600 data buffers are converted to features one after the other. The processing time is roughly around 17 seconds; let the number of buffers grow, and this will grow linearly with it. Now think about this: the computations in each cycle of the for loop are all independent of each other, so if we had more computational resources available, we could start to think about distributing the burden across the available computing nodes. I suspect most of you would think that would be a hard task, so let me challenge that perception. All I'll do here is change my for keyword to parfor, make sure I have a parallel pool enabled, then run my loop again. The buffers are now processed asynchronously by a pool of four separate MATLAB worker sessions running in the background, and I'm finished in a fraction of the original time. The actual performance gain will change depending on the particular problem. The bottom line is that because I have Parallel Computing Toolbox installed on my machine, I was able to open locally a number of MATLAB workers equal to the number of available cores — I have four cores here, but with external resources like a cluster that number can be driven up at will — and then I was able to distribute independent iterations of a long for loop simply by changing for into parfor, as in parallel for. Once we're done, we can save our feature data set and go back to where we left our problem.
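The pattern is minimal enough to sketch in a few lines; extractFeatures and buffers are hypothetical names standing in for the feature function and the buffered data described above:

    numBuffers = 600;
    feat = zeros(numBuffers, 66);              % one row of 66 features per buffer
    tic
    parfor k = 1:numBuffers                    % was: for k = 1:numBuffers
        feat(k, :) = extractFeatures(buffers(:, :, k));   % iterations are independent
    end
    toc                                        % compare against the serial timing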
We left our problem at the stage where we needed to select a classification algorithm, and now we have the data ready to go ahead. When you need a classifier, you have a choice among a high number of different types of algorithms. The MATLAB documentation provides some guidance on which types are best suited to which problems, but the whole process of trial and error can be intimidating, especially if you're not familiar with machine learning in general, or in particular with a reasonable number of classification algorithms. To address that issue, from release R2015a MATLAB has a new app called Classification Learner. You can pick it from the Apps tab, but let me just load a preset feature data set and open the app by running the command classificationLearner right from my script. To start, I load my data and pick this option on the right-hand side here to leave out a fraction of my data set for validation. Note that before loading, the data in my script was also arranged as a MATLAB table, allowing the tool to associate names to my features and display some simple statistics for each of them; my data also included an activity ID and an activity label for each available feature vector. As I click Import Data, on the right I have a simple visual of my data points in a 2-D feature space; I can choose which of my 66 features to use for x and which for y, and get a feel for how well my data samples are separable.

At this point, we simply start selecting different classifier algorithms from this catalog and train them one by one on our data set using this button. You don't really need to know what these algorithms are, how they work, or what parameters they need to even run, because the tool selects the most appropriate defaults for you, and if you want, you can change them by using this Advanced button. As I hope you can see, when the training completes, the tool displays an accuracy summary beside each of the selected options and highlights in green the one with the best accuracy. At this stage you may also want to understand the performance of the classifier a bit more closely, and this app has a few diagnostic options available right from within it, like for example the confusion matrix, which shows how well our predictions map to the actual known values in the data set. For example, a full green diagonal here, with no instances outside it, would indicate a hundred percent prediction accuracy.

As in many other cases with MATLAB apps, you can then turn your interactive exploratory work into a bit of code, to automate the same steps programmatically. In our script we're using a preset version that comes from exactly the same workflow. What's interesting here is a three-line-of-code pattern consisting of choosing the settings for the classifier, training it — note the fit keyword in here — and running it on new data to return the predicted class. We can use the generated function right from within our script to return a trained classifier and use it on new, unknown feature vectors.
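That three-step pattern looks roughly like this; the sketch below uses a k-nearest-neighbor model purely as an example, and Xtrain, ytrain, Xtest, and ytest are assumed to hold the training/validation split of the feature data set described above:

    mdl   = fitcknn(Xtrain, ytrain, 'NumNeighbors', 5);  % choose settings and train (the fit keyword)
    ypred = predict(mdl, Xtest);                         % run on new data
    accuracy = mean(ypred == ytest)                      % fraction of correct predictions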
I won't do that just yet, because there's something more that deserves a mention. The Classification Learner provides an intuitive way to access a good number of conventional classifiers that ship with Statistics and Machine Learning Toolbox. An example of an alternative way to address this problem is neural networks. Again, in this case, designing and training a network from scratch would be very complicated, but Neural Network Toolbox provides apps and functions to get started quickly and design a functional network with only a few lines of code. In the interest of time, and to take a different perspective, let me just share how one could use a programmatic approach in this case to do the same job. Here I initialize a pattern recognition network with 18 neurons in a single hidden layer with a single line of code; then I train it and return the predicted classes on the test set in just a couple more lines. If you've ever come across the theory of neural networks, you'll know that the complexity of the math underneath these operations is considerable — just think about using backpropagation for an arbitrary network architecture, and all the optimization options that you may need to consider for your cost function. In this case, most of the well-established algorithms are just available to use, so you can focus on solving your specific problem. When I execute this code, I get an interface to monitor the training progress, also confirming the architecture of the network: 18 neurons in the hidden layer, 66 inputs as the number of features, and six output classes. When we're done, the trained network is available in my workspace, and again, through a programmatic approach, I can use it to run the prediction on the whole test portion of my data set. As we did before interactively, I can generate diagnostics programmatically, as in the case of this confusion matrix. This reports around 92% accuracy, which is pretty good, along with a detailed view of how well the predicted classes match the real known values. As an example, here we can notice that a lot of the sitting samples were confused for standing, and vice versa, so that would be an area for improvement in our algorithm.
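A rough programmatic sketch of those few lines might look as follows; Xtrain, ytrain, and Xtest are assumed to hold the training and test splits of the feature data set, with class labels as integers 1 through 6 (the toolbox expects inputs and one-hot targets arranged as columns):

    net = patternnet(18);                          % pattern recognition net, 18 hidden neurons
    net = train(net, Xtrain', dummyvar(ytrain)');  % features and one-hot targets as columns
    scores = net(Xtest');                          % class scores for the test set
    [~, ypred] = max(scores, [], 1);               % pick the most likely class per sample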
Now let me take a step back and review what we have achieved. We were able to train and use a classifier operating on high-quality features extracted with signal processing methods. We tackled a problem that required significant domain expertise in the two areas of signal processing and machine learning, which had no single way of being addressed and could have taken a long time to solve. Instead, this only took a few iterations: I used a few different apps and algorithms that were readily available, without needing to open any complicated book or program any math from scratch. A remarkable result was a signal processing function able to extract 66 features in only 54 lines of code.

Now I would like to spend the last 10 minutes or so of our session considering a few common engineering challenges slightly beyond the algorithm exploration phase that we've discussed until now. Imagine that our final objective was to run our predictions on signals coming from a real acceleration sensor. In this case we used an existing data set, and we didn't ask ourselves many questions about how it had been collected. In general, getting hold of relevant data may well turn out to be the first problem on our list. I very often talk to engineers who assume that to acquire real-world signals and to explore your algorithms you need two different tools, and who end up spending quite some time transitioning data between MATLAB and some external data acquisition software. But it turns out that MATLAB can directly connect to a number of sensors and data acquisition devices, and using that connectivity can further accelerate your discovery cycles. Because our example uses accelerometer data from a smartphone, I thought I'd also include a reference to the free support packages, downloadable from the MathWorks website, that let you stream sensor signals from iOS and Android devices directly into MATLAB.

Now, thinking about the end of a development workflow, imagine that your shiny MATLAB algorithm had to be implemented on a real-time system, for example on an embedded device close to the accelerometer itself. In this case, not only would the final real-time software probably have to be rewritten in C or C++, but the actual functionality of the algorithm would have to be re-adapted for the final product. The machine learning portion would probably need to be simpler; it is common, for example, for embedded classifiers to be pre-trained offline and implemented in a lightweight version that only does online prediction. The signal processing would also vary, probably even more significantly. For example, filters that work on signals streaming from sensors continuously accept new samples and update their internal states accordingly. If the original MATLAB model didn't take these effects into account, it's possible that the final implementation could never match the performance of the original simulation, potentially compromising the success of the actual end product. The good news is that MATLAB is not only relevant to the initial signal analysis and algorithm exploration phase: it can also be used to simulate real-time systems and to generate embeddable C source code. It's beyond the scope of this webinar to cover these aspects in detail, but let me give you an idea of what's possible in this area.

The quickest point to cover is the deployment of the classifier. When we trained and tested our neural network classifier, everything had been done through a network object that we called net. This has a wealth of functionality attached to it, and the actual code for the math used for prediction may be quite hard to find; but from my object net I can run the genFunction method and generate a simple prediction function that only models what needs to happen at runtime, using just basic constructs.

Let's now take a look at the modeling of the digital signal processing. On the left here, in extractSignalFeatures.m, I have the 54-line feature extraction function that we reviewed before. Most signal processing functions used here come from Signal Processing Toolbox. These were extremely valuable during our exploration phase, and they are the best choice for data analysis tasks, but they are not intended to simulate the behavior of a real-time system, and that's not what we had in mind when we put this code together in the first place. Look, for example, at how we filter our signals. My first consideration is really just a side note, but it may help us get into the right mindset: here we recompute the filter coefficients for every new signal portion, even if they're always the same. Then, we're taking the full sequence of interest at once, and we assume we're operating on a pretty long one; and as we do that, every time we assume we're starting with a filter with a clean history, or zero internal states.

Now, for comparison, let me look at another way to model this process that has a real-time implementation in mind, and that's in this other function, featuresFromBuffer.m. Most of the signal processing in this rewritten function comes from DSP System Toolbox. These objects have been developed with system design and simulation in mind; they may be less practical to use for signal analysis, but they can be used to accurately model real-time DSP systems. If we just look at this filter for comparison: here it's a filter object with a notion of internal structure, and, as you can see by simply creating one in MATLAB, one could even get bit-accurate behavior by capturing the complete data type specifications, for example for a fixed-point implementation. The object keeps hold of its internal states, and it's declared as persistent, so exiting and re-entering the function here will find it again in its previous state. So really, if required, it could even take in one sample at a time and it would still operate as expected. As for the coefficients, they are computed only once, the first time the function is called, when this persistent variable is initialized; after that, they're simply used every time by calling the step method on every new buffer of data. As a side effect, this filter runs very efficiently: it's initialized just once, and from the second time around it only executes the computations strictly necessary to process the input. These attributes make it ideal for use with streamed signals in the context of system design and simulation.
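A minimal sketch of that streaming pattern is below. The design values mirror the earlier high-pass specs, but the function name, filter order, and object choice (dsp.BiquadFilter) are assumptions for illustration rather than the webinar's exact code:

    function y = streamingHighpass(x)
    % Filter one buffer of samples, preserving filter states across calls.
    persistent hpf
    if isempty(hpf)
        [z, p, k] = butter(4, 0.8/(50/2), 'high');  % one-time design, fs = 50 Hz
        sos = zp2sos(z, p, k);                      % second-order sections, gain embedded
        hpf = dsp.BiquadFilter('SOSMatrix', sos);   % object keeps its internal states
    end
    y = step(hpf, x);    % states carry over automatically to the next buffer
    end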
A first advantage of this new system model, as we may call it now, is that by simulating it we can verify early on the design of our algorithms for an embedded system and check that the behavior is as expected. I'm sure this visualization now looks familiar: this is what I showed you right at the beginning of this session to introduce our example. The dynamic simulation here offers a different perspective; for example, I can check the stability of the prediction in the transitions from one activity to the other. In another application, I may need to analyze the signal as I would on an oscilloscope, for example using triggers and markers. This here is a time scope, but other types of visualization are also possible, including obviously in the frequency domain with a spectrum analyzer, but we won't look at that right now.

If we take a look at the code that produces the simulation, we'll meet again a lot of the programming practices that we've seen in the new feature extraction function. So, for example, we use a while loop with new data processed at every iteration. This scope object here is what we're using for continuous online visualization: within the while loop, we just keep pushing in more data using the same simple construct of the step method that we've already seen. At the beginning of this loop we use a file reader object in a similar way, to incrementally advance through a data file without needing to load a potentially huge file into memory or to do any complex indexing into the source data; we simply pass in the file name at the beginning and get a new frame of samples at every iteration. Here I also use a buffer to help me operate on a longer data window than the system may be receiving in a single iteration, all wrapped in a separate object that hides away any indexing and is used through the same step interface. And right in the middle of the loop you can see the prediction function that we are simulating, complete with our new DSP models and the lightweight neural network classifier.

Beyond the ability to simulate our system online, a second substantial advantage of having real-time models of both the DSP processing components and the deployed neural network is that we can now automatically generate C or C++ source code from them, which can be used in an embedded product, an embedded prototype, or simply as a reference to share with the downstream software engineering team. There would be a lot more to say on this, including the ability, say, to directly generate fixed-point or target-optimized code, but I'll just show you the general idea of how that works. Although you could also go through this workflow using a dedicated built-in app, the general idea is that with this simple command, codegen, we can turn our MATLAB function predictActivityFromSignalBuffer into a fully equivalent, open C function with no libraries attached. The generated C is fully open; in this case I put no effort into optimizing the generated code, but a lot of features are there to do that, including the ability to generate all fixed-point code. OK, I think we've seen in action all that I was planning to show you.
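The command itself is short; a hedged sketch of its general shape is below (the input buffer size and the library configuration are assumptions for illustration):

    cfg = coder.config('lib');     % generate a standalone C library
    codegen -config cfg predictActivityFromSignalBuffer -args {zeros(128,3)}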
Now, let me go back to my slides. I'll take a step back and review what we've done, along with the capabilities and tools that I've used in the various parts of this presentation. Signal analysis was the first area where the use of de-facto standard built-in functions saved us a lot of time; Signal Processing Toolbox is where all these useful functions came from. Just imagine having to implement all those formulas from scratch, let alone look up the math and try to understand it. Parallel Computing Toolbox let us distribute computationally intensive for loops simply by changing a for loop into a parfor loop; additional productivity options are available to scale up the framework that we used to larger architectures, including computer clusters or the cloud. Statistics and Machine Learning Toolbox not only allowed us to test a good number of classifiers, but also to quickly explore and compare different options interactively in the Classification Learner app, which I feel sped up our discovery cycle considerably. After exploring a few conventional classifiers, we also used Neural Network Toolbox to create a common network topology used for pattern recognition, train it, and test it. We also generated a lightweight, pre-trained version of that network, which captured the runtime computations using basic constructs fully supported by the C code generation engine. With the fundamentals of our signal processing algorithms already consolidated, we used a number of objects from DSP System Toolbox to model the real-time implementation of our algorithms. We ran an online simulation of our system design using objects that facilitate the streaming of data from long signals stored on disk, and we used scopes that are optimized to handle the continuous visualization of streamed signals, similar to how one would visualize real-world signals with a benchtop instrument. As a side effect, our online modeling efforts made our algorithms a lot more efficient to execute in simulation, and also made them ready to generate C or C++ source code that could be directly deployed onto an embedded processor. And that's where we used MATLAB Coder. MATLAB Coder is a code generation engine that turns MATLAB algorithms into fully open C or C++ source code. There would be a lot to say about what MATLAB Coder can do, especially about generating embeddable source code, so I thought I'd refer you to a great introductory webinar, available in pre-recorded format on our website, called MATLAB to C Made Easy.

With that, we've come to the end of this webinar on signal processing and machine learning techniques for sensor data analytics. I hope it's been useful in highlighting some MATLAB capabilities that you weren't yet familiar with. I will aim to make the code that I used available in the coming few weeks, so you can review the example at your own pace. If you had to forget everything I said today, I'd hope you at least take away the following three key ideas. First of all, our open-ended project was made possible by the availability of an extensive range of built-in functions for both signal processing and machine learning, which allowed us to experiment quickly with different options without having to implement any math from scratch. The complementary part of the picture was the MATLAB environment itself, from the basic visualization capabilities to the built-in apps that generate reusable code, making constant use of a language that makes it easy to let advanced things happen within a few lines of code. Finally, I took you on a tour through a set of MATLAB capabilities for transitioning abstract ideas into real-time algorithm implementations: we turned signal processing algorithms into detailed DSP system models that could be simulated over time, and from those we generated C source code that could be recompiled on an embedded platform.
Info
Channel: MATLAB
Views: 55,281
Keywords: MATLAB, Simulink, MathWorks
Id: GZ3KUPqA1JM
Length: 42min 45sec (2565 seconds)
Published: Thu Jun 29 2017