TRIAD Lecture Series by Professor Johannes Schmidt-Hieber (5/5)

It is a great pleasure to announce that Professor Johannes Schmidt-Hieber will visit us and deliver a series of lectures on the modeling of neural networks. All lectures will be held from 10:30 am to 11:30 am on the following dates:
1.    Wednesday, March 6, 2019
2.    Friday, March 8, 2019
3.    Wednesday, March 13, 2019
4.    Friday, March 15, 2019
5.    Monday, March 18, 2019
All lectures will take place in Groseclose 402. They are open to the public, and no RSVP is needed. The topics are as follows.

Lecture 1) Survey on neural network structures and deep learning
There are many different types of neural networks, which differ in complexity and in the data types they can process. This lecture provides an overview and surveys the algorithms used to fit deep networks to data. We discuss different ideas that underlie the existing approaches to a mathematical theory of deep networks.
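As a concrete illustration of the basic fitting procedure surveyed here, below is a minimal sketch of gradient descent on a one-hidden-layer ReLU network; the data, architecture, and step size are illustrative assumptions, not material from the lecture.

import numpy as np

# Minimal sketch: fit a one-hidden-layer ReLU network to noisy 1-D data
# with plain gradient descent on the empirical squared loss.
# All sizes, the target function, and the step size are assumptions.
rng = np.random.default_rng(0)
n, d, width = 200, 1, 32
X = rng.uniform(-1, 1, size=(n, d))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)

W1 = rng.standard_normal((d, width)) / np.sqrt(d)
b1 = np.zeros(width)
w2 = rng.standard_normal(width) / np.sqrt(width)

lr = 0.05
for step in range(2000):
    h = np.maximum(X @ W1 + b1, 0.0)    # ReLU hidden layer
    pred = h @ w2                       # linear output layer
    resid = pred - y                    # residuals of the squared loss
    # Backpropagation: gradients of 0.5 * mean(resid**2).
    grad_w2 = h.T @ resid / n
    dh = np.outer(resid, w2) * (h > 0)  # ReLU derivative mask
    grad_W1 = X.T @ dh / n
    grad_b1 = dh.mean(axis=0)
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    w2 -= lr * grad_w2

h = np.maximum(X @ W1 + b1, 0.0)
print("final training loss:", 0.5 * np.mean((h @ w2 - y) ** 2))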
Lecture 2) Theory for shallow networks 
We start with the universal approximation theorem and discuss several proof strategies that provide some insight into which functions can be easily approximated by shallow networks. Based on this, a survey of approximation rates for shallow networks is given, and it is shown how these lead to estimation rates. In the lecture, we also discuss methods that fit shallow networks to data.
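For reference, one standard formulation of the universal approximation theorem (in the Cybenko/Hornik form for a continuous sigmoidal activation \sigma; the lecture may of course treat other variants): for every continuous function f on [0,1]^d and every \varepsilon > 0 there exist N \in \mathbb{N}, a_j, b_j \in \mathbb{R}, and w_j \in \mathbb{R}^d such that

\sup_{x \in [0,1]^d} \Bigl| f(x) - \sum_{j=1}^{N} a_j \, \sigma(w_j^\top x + b_j) \Bigr| < \varepsilon.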
Lecture 3) Advantages of additional layers
Why are deep networks better than shallow networks? We provide a survey of the existing ideas in the literature. In particular, we discuss the localization of deep networks and functions that can be easily approximated by deep networks, and finally turn to the Kolmogorov-Arnold representation theorem.
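For completeness, the Kolmogorov-Arnold representation theorem states that every continuous function f on [0,1]^d can be written exactly as a superposition of univariate functions: there exist continuous functions \Phi_q and \phi_{q,p} such that

f(x_1, \dots, x_d) = \sum_{q=0}^{2d} \Phi_q \Bigl( \sum_{p=1}^{d} \phi_{q,p}(x_p) \Bigr),

a structure that resembles a network with two hidden layers.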
Lecture 4) Statistical theory for deep ReLU networks
We outline the theory underlying the recent bounds on the estimation risk of deep ReLU networks. In the lecture, we discuss specific properties of the ReLU activation function that relate to skip connections and the efficient approximation of polynomials. Based on this, we show how risk bounds can be obtained for sparsely connected networks.
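To make these two ReLU properties concrete: the identity is exactly representable, since \sigma(x) - \sigma(-x) = x for the ReLU \sigma, which is what makes skip connections cheap; and by Yarotsky's construction, the square function on [0,1] is approximated by composing a small ReLU "hat" network with itself, with error decaying like 4^{-(m+1)} in the number of composed layers m. A minimal numerical sketch (assuming NumPy):

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Fact 1 (skip connections): the identity is exactly representable,
# x = relu(x) - relu(-x), so a ReLU layer can pass its input through.
x = np.linspace(-1, 1, 5)
assert np.allclose(relu(x) - relu(-x), x)

# Fact 2 (efficient approximation of x^2 on [0,1]): composing the ReLU
# "hat" function g with itself, Yarotsky's construction
#     x^2 ~ x - sum_{s=1}^m g^{(s)}(x) / 4^s
# has error at most 4^{-(m+1)}, i.e. depth buys exponential accuracy.
def hat(x):
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, m):
    out, g = x.copy(), x.copy()
    for s in range(1, m + 1):
        g = hat(g)
        out -= g / 4.0 ** s
    return out

xs = np.linspace(0.0, 1.0, 1001)
for m in (1, 3, 6):
    err = np.max(np.abs(square_approx(xs, m) - xs ** 2))
    print(f"m={m}: max error {err:.2e} (bound {4.0 ** -(m + 1):.2e})")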
Lecture 5) Energy landscape and open problems
To derive a theory for gradient descent methods, it is important to have some understanding of the energy landscape. In this lecture, an overview of existing results is given. The second part of the lecture is devoted to future challenges in the field. We describe important steps needed for the future development of the statistical theory of deep networks.
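To give a minimal sense of the object being studied, the following sketch (toy data and network sizes are assumptions) evaluates the empirical squared loss of a shallow ReLU network along a random line through parameter space; such one-dimensional slices are a standard way to probe the non-convexity that a gradient-descent theory has to handle.

import numpy as np

# Minimal sketch (toy data and network are assumptions): probe the
# empirical squared-loss landscape of a shallow ReLU network along a
# random line theta + t * direction through parameter space.
rng = np.random.default_rng(1)
n, d, width = 100, 2, 16
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) * np.cos(X[:, 1])

def loss(theta):
    W1 = theta[: d * width].reshape(d, width)
    w2 = theta[d * width :]
    h = np.maximum(X @ W1, 0.0)
    return 0.5 * np.mean((h @ w2 - y) ** 2)

theta = rng.standard_normal(d * width + width)
direction = rng.standard_normal(theta.shape)
direction /= np.linalg.norm(direction)

# The loss along the slice is piecewise smooth and typically non-convex.
for t in np.linspace(-2, 2, 9):
    print(f"t={t:+.2f}  loss={loss(theta + t * direction):.4f}")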

Video link: https://smartech.gatech.edu/handle/1853/60958 

Contact: huo@gatech.edu
