Tutorial : Mathematics of Deep Learning - Part 1
Video Statistics and Information
Channel: ComputerVisionFoundation Videos
Views: 17,990
Keywords: CVPR17, Deep Learning, Tutorial
Id: Mdp9uC3gXUU
Length: 69min 49sec (4189 seconds)
Published: Wed Aug 23 2017
@6:40 He's confusing dropout with DropConnect: dropout randomly zeroes whole activations (units), whereas DropConnect randomly zeroes individual weights.
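For anyone unsure of the distinction, here is a minimal NumPy sketch of the two techniques for a single linear layer (the keep probability `p`, layer sizes, and inverted scaling are illustrative choices, not taken from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))  # weights of one linear layer
x = rng.standard_normal(4)       # input activations
p = 0.5                          # keep probability

# Dropout: one mask entry per *unit*; whole activations are zeroed.
dropout_mask = rng.random(x.shape) < p
y_dropout = W @ (x * dropout_mask / p)  # inverted-dropout scaling

# DropConnect: one mask entry per *weight*; individual connections are zeroed.
dropconnect_mask = rng.random(W.shape) < p
y_dropconnect = (W * dropconnect_mask / p) @ x
```

So dropout draws a mask the size of the activation vector, while DropConnect draws a mask the size of the weight matrix.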
Throughout most (all?) of the presentation, X refers to the parameters of the neural network.
At 25:20, he assumes that l(Y, X) is convex in X, but says that X "is the output of the network," even though Phi denotes the outputs on that same slide. And if X actually denotes the network parameters, then the convexity assumption doesn't hold, since the network output is nonconvex in the parameters.
At 37:53, he even writes down the objective function with regularization as l(Y, X) + lambda theta(X).
Can anyone explain what's intended here?
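A plausible reading, assuming the slides follow the Haeffele–Vidal global-optimality formulation (an assumption on my part, not confirmed by the video): the convexity is meant with respect to the network *output*, with the parameters appearing inside Phi, i.e.

```latex
\min_{W^1,\dots,W^K} \; \ell\big(Y, \Phi(X, W^1,\dots,W^K)\big) + \lambda\,\Theta(W^1,\dots,W^K)
```

where \(\ell(Y,\cdot)\) is convex in its second argument (the output \(\Phi\)), \(X\) is the input data, and the \(W^k\) are the layer parameters. Under that reading, writing l(Y, X) with X as parameters would just be an overloaded-notation slip on the slides.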
Here's Part 2, in case anyone's interested.