Title:
Lecture 4: Mathematics for Deep Neural Networks: Statistical theory for deep ReLU networks

Abstract
We outline the theory underlying recent bounds on the estimation risk of deep ReLU networks. In the lecture, we discuss specific properties of the ReLU activation function that enable skip connections and efficient approximation of polynomials. Building on these properties, we show how risk bounds can be obtained for sparsely connected networks.
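The two ReLU properties mentioned in the abstract can be sketched numerically. Below is a minimal NumPy illustration, not taken from the lecture itself: the identity x = ReLU(x) − ReLU(−x), which lets a ReLU network pass a signal through unchanged (the basis of skip connections), and a Yarotsky-style approximation of x² on [0, 1] built from composed "hat" functions, where m compositions give uniform error at most 2^(−2m−2). Function names (`hat`, `square_approx`) are illustrative choices, not notation from the lecture.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Property 1: ReLU can realize the identity map,
# since x = relu(x) - relu(-x) for every real x.
# This is what makes skip connections cheap in ReLU networks.
x = np.linspace(-1.0, 1.0, 5)
assert np.allclose(relu(x) - relu(-x), x)

# Property 2: efficient approximation of x**2 on [0, 1].
# The "hat" function g(x) = 2x on [0, 1/2], 2 - 2x on [1/2, 1]
# is exactly representable with two ReLU units:
def hat(x):
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

# Composing the hat function m times and combining the iterates
# yields x**2 up to uniform error 2**(-2m - 2) (Yarotsky's construction):
def square_approx(x, m):
    out = x.copy()
    g = x.copy()
    for k in range(1, m + 1):
        g = hat(g)           # k-fold composition of the hat function
        out = out - g / 4.0**k
    return out

xs = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(square_approx(xs, 6) - xs**2))
# With m = 6 the sampled error stays below 2**(-14).
```

The exponential error decay in the network depth m is the key to the efficient polynomial approximation results the lecture builds on.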
Sponsor
Date Issued
2019-03-15
Extent
58 minutes, 39 seconds
Resource Type
Moving Image
Resource Subtype
Lecture
Rights Statement
Rights URI