Keras Explained

Captions
Hello world, it's Siraj, and the question I get asked the most by far is: how do I get started with deep learning? It makes sense to ask; there are so many different learning paths and tools you could use, it's hard to just pick one and roll with it. In this video I'm going to explain why you should use a deep learning library called Keras to build your first deep neural networks, and compare it to other options. Then we'll use Keras to build an app that generates text in the style of any given author.

Deep learning only started getting really popular a few years ago, when Hinton's team submitted a model that blew away the competition in the Large Scale Visual Recognition Challenge. Their deep neural network was significantly better than all benchmarks (Illuminati confirmed) because it used lots of GPU computation and data. Others began to take notice and implemented their own deep neural networks for different tasks, resulting in a deep learning renaissance. Deep learning played a huge part in the biggest AI success story of 2017: AlphaGo, Google's algorithm that mastered the game of Go, a feat previously thought near impossible. Similar improvements were made in fields like vision, text, and speech recognition. WaveNet, for example, was a model that massively sped up improvements to speech-to-text and text-to-speech, resulting in lifelike generated audio.

Theano was really the first widely adopted deep learning library. It was maintained by the University of Montreal, but in September of last year they announced that they would stop developing Theano in 2018. Different open-source Python deep learning frameworks have been introduced over the past couple of years, and some got lots of traction. As of now, TensorFlow seems to be the most used deep learning library, based on the number of GitHub stars and forks as well as Stack Overflow activity. But there are other libraries that are growing passionate user bases as well. PyTorch is a great example: it was introduced in January 2017 by Facebook, who basically ported the popular Torch framework, written in Lua, to Python. The main driver behind PyTorch's popularity was the fact that it used dynamic computation graphs, which are define-by-run instead of the traditional define-and-run. When inputs can vary, as with unstructured data like text, this is super useful and efficient. With static graphs, we first draw the graph, then inject the data to run it; that's define-and-run. For dynamic graphs, the graph is defined on the fly via the forward computation of the data; that's define-by-run.
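To make that contrast concrete, here is a minimal sketch of define-by-run, assuming PyTorch is installed; the tensor shapes and the steps parameter are made up for illustration. Because the graph is traced as the data flows forward, ordinary Python control flow can change the graph on every call:

import torch

def forward(x, steps):
    # define-by-run: the graph is built on the fly, and its depth
    # depends on the `steps` argument rather than being fixed up front
    w = torch.randn(3, 3, requires_grad=True)
    h = x
    for _ in range(steps):
        h = torch.tanh(h @ w)
    return h.sum(), w

loss, w = forward(torch.randn(1, 3), steps=4)
loss.backward()          # gradients flow back through all four matmuls
print(w.grad.shape)      # torch.Size([3, 3])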
But in addition to TensorFlow's main framework, several companion libraries were released, including TensorFlow Fold for dynamic computation graphs and TensorFlow Transform for data input pipelines. The TensorFlow team also announced a new eager execution mode, which works similarly to PyTorch's dynamic computation graphs. But wait, other tech giants have also been getting in on the game: Microsoft launched its Cognitive Toolkit (CNTK) last year, Facebook launched Caffe2, Amazon launched MXNet, and DeepMind released Sonnet. There's also Deeplearning4j, Dlib, H2O.ai, and Spark. Oh, and Facebook and Microsoft announced the ONNX open format to share deep learning models across frameworks; for example, you can train your model in one framework but then serve it in production in another one.

I know, I know, I know: deep learning framework overload. But look, the best way to learn how some AI concept works is to start building it and figure it out as you go, and the best way to do that is by first using a high-level library called Keras. Keras is effectively an interface that wraps multiple frameworks: you can use it as an interface to TensorFlow, Theano, or CNTK, and it works the same no matter what backend you use. François Chollet, a deep learning researcher at Google, created it and maintains it, and last year Google announced that Keras was chosen as the official high-level API of TensorFlow. When it comes to writing and debugging custom modules and layers, PyTorch is the faster option, while Keras is definitely the fastest track when you need to quickly train and test a model built from standard layers.

Using Keras, the pipeline for building a deep network looks like this: you define it, compile it, fit it, evaluate it, and then use it to make predictions. Consider a simple three-layer neural network with an input layer, a hidden layer, and an output layer. Each of these layers is just a matrix operation: multiply the input by a weight, add a bias, and activate the result. Repeat that twice and you get a prediction.
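Here is what those layer operations look like in plain NumPy; the layer sizes and the sigmoid activation are illustrative assumptions, not anything prescribed in the video:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.random.randn(1, 4)                      # one input with 4 features
W1, b1 = np.random.randn(4, 8), np.zeros(8)    # hidden layer weights and bias
W2, b2 = np.random.randn(8, 1), np.zeros(1)    # output layer weights and bias

hidden = sigmoid(x @ W1 + b1)           # multiply by a weight, add a bias, activate
prediction = sigmoid(hidden @ W2 + b2)  # repeat for the next layer
print(prediction)                       # a single value between 0 and 1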
Deep networks have multiple layers; they can have three, four, five, whatever. That's why they're called deep. And these layers don't have to use just one type of operation. There are all sorts of layers out there for different types of networks: convolutional layers, dropout layers, recurrent layers, the list goes on. But the basic idea of a deep neural network is applying a series of math operations, in order, to some input data. Each layer represents a different operation that then passes its result on to the next layer. So in a way, we can think of these layers as building blocks: if we can list out all the different types of layers, we can wrap them in their own classes and then reuse them as modular building blocks. That's exactly what Keras does. It also abstracts away a lot of the magic numbers you'd have to input into a deep network written in, say, pure TensorFlow.

When we define a network in Keras, it's defined as a sequence of layers using the Sequential class. Once we create an instance of the Sequential class, we can add new layers, where each new line is a new layer. We could do this in two steps, or we could do it in one step by creating an array of layers beforehand and passing it to the constructor of the Sequential model. The first layer in the network must define the number of inputs to expect; the way this is specified can differ depending on the network type. Think of a Sequential model as a pipeline, with your raw data fed in at the bottom and predictions coming out at the top. This is helpful in Keras, as concepts that were traditionally associated with a layer can also be split out and added as separate layers, clearly showing their role in the transform of data from input to prediction. For example, the activation functions that transform the summed signal from each neuron in a layer can be extracted and added to the Sequential model as a layer-like object called Activation. The choice of activation function is most important for the output layer, as it defines the format the predictions will take.

Once we've defined our network, we'll compile it. That means transforming the simple sequence of layers into a highly efficient series of matrix transforms intended to be executed on a GPU or CPU, depending on our configuration. It's a precompute step for the network, and it's required after defining a model. Compilation requires a number of parameters to be specified, specifically tailored to training our network: the optimization algorithm used to train the network and the loss function used to evaluate it are things that we decide. This is the art of deep learning.

Once the network is compiled, it can be fit, which means adapting the weights to a training dataset. Fitting the network requires the training data to be specified: both a matrix of input patterns, X, and an array of matching output patterns, y. The network is trained using the backpropagation algorithm and optimized according to the optimization algorithm and loss function specified when compiling the model. Finally, once we are satisfied with the performance of our fit model, we can use it to make predictions on new data. This is as easy as calling the predict function on the model with an array of new input patterns. For our text generation example, we'll see that it generates text in the style of our favorite author, just as we fed it in.
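Putting those steps together, here is a minimal sketch of the whole pipeline, assuming Keras is installed with one of its backends; the layer sizes, the random training data, and the choice of optimizer and loss are illustrative assumptions rather than anything from the video:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation

# Define: a sequence of layers. The first layer declares the number
# of inputs to expect; each Activation is split out as its own layer.
model = Sequential()
model.add(Dense(8, input_dim=4))
model.add(Activation('relu'))
model.add(Dense(1))
model.add(Activation('sigmoid'))    # output activation fixes the prediction format

# Equivalently, in one step, pass an array of layers to the constructor:
# model = Sequential([Dense(8, input_dim=4), Activation('relu'),
#                     Dense(1), Activation('sigmoid')])

# Compile: choose the optimizer and the loss used to evaluate the network.
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Fit: adapt the weights to a matrix of inputs X and matching outputs y.
X = np.random.rand(100, 4)
y = (X.sum(axis=1) > 2).astype(int)
model.fit(X, y, epochs=10, batch_size=16, verbose=0)

# Evaluate, then predict on new input patterns.
loss, acc = model.evaluate(X, y, verbose=0)
preds = model.predict(np.random.rand(3, 4))
print(loss, acc, preds.shape)       # preds.shape == (3, 1)

Swapping the backend between TensorFlow, Theano, and CNTK leaves this code unchanged, which is the point of the wrapper.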
Three points to remember: lots of new competitors showed up in 2017 for deep learning libraries, but Keras is still the easiest way to get started; PyTorch is getting really popular and is the best way to build models next to Keras; and deep networks are a series of math operations in the form of layers, so just mix and match them to get different results.

The coding challenge winner from the War Robots video is Alberto Garces. He used a proximal policy optimization algorithm to train an AI to balance a pendulum using the OpenAI Gym environment. Top-notch work, Alberto. And the runner-up is Sven Niederberger, who landed a simulated SpaceX rocket using PPO; such a cool use case. This week's coding challenge is to use Keras to build your own deep neural network. GitHub links go in the description, and coding challenge winners will be announced next week. Please subscribe for more programming videos, and for now I've gotta not use anything made by Microsoft, so thanks for watching.
Info
Channel: Siraj Raval
Views: 235,813
Rating: 4.7388287 out of 5
Keywords: keras, keras crash course, keras neural network, keras explained, keras tutorial, keras expained siraj, keras 2018, keras tutorial for beginners, keras deep learning, keras example, keras example tensorflow, nips, live coding, ai, software, c++, python, machine learning, deep learning, tanmay bakshi, autoencoder, simulink, research, analysis, ml, computer vision
Id: j_pJmXJwMLA
Length: 9min 20sec (560 seconds)
Published: Sat Jan 06 2018