<node id="617958">
  <nid>617958</nid>
  <type>event</type>
  <uid>
    <user id="34963"><![CDATA[34963]]></user>
  </uid>
  <created>1550356374</created>
  <changed>1553373573</changed>
  <title><![CDATA[TRIAD Lecture Series by Professor Johannes Schmidt-Hieber (2/5)]]></title>
<body><![CDATA[<p>It&#39;s a great pleasure to announce that Professor Johannes Schmidt-Hieber will visit us and deliver a series of lectures on the modeling of neural networks. All lectures will be from 10:30 am to 11:30 am on the following dates:&nbsp;<br />
1.&nbsp;&nbsp; &nbsp;Wednesday, March 6, 2019<br />
2.&nbsp;&nbsp; &nbsp;Friday, March 8, 2019<br />
3.&nbsp;&nbsp; &nbsp;Wednesday, March 13, 2019<br />
4.&nbsp;&nbsp; &nbsp;Friday, March 15, 2019<br />
5.&nbsp;&nbsp; &nbsp;Monday, March 18, 2019<br />
All lectures will be in Groseclose 402. The topics of the lectures are listed below. The lectures are open to the public, and no RSVP is needed.&nbsp;<br />
Lecture 1) Survey on neural network structures and deep learning<br />
There are many different types of neural networks, which differ in complexity and in the data types they can process. This lecture provides an overview and surveys the algorithms used to fit deep networks to data. We discuss the different ideas that underlie the existing approaches to a mathematical theory of deep networks.<br />
Lecture 2) Theory for shallow networks&nbsp;<br />
We start with the universal approximation theorem and discuss several proof strategies that provide insight into the functions that can be easily approximated by shallow networks. Based on this, we survey approximation rates for shallow networks and show how they lead to estimation rates. In the lecture, we also discuss methods that fit shallow networks to data.<br />
Lecture 3) Advantages of additional layers<br />
Why are deep networks better than shallow networks? We survey the existing ideas in the literature. In particular, we discuss localization of deep networks and functions that can be easily approximated by deep networks, and finally discuss the Kolmogorov-Arnold representation theorem.&nbsp;<br />
Lecture 4) Statistical theory for deep ReLU networks<br />
We outline the theory underlying the recent bounds on the estimation risk of deep ReLU networks. In the lecture, we discuss specific properties of the ReLU activation function that relate to skip connections and the efficient approximation of polynomials. Based on this, we show how risk bounds can be obtained for sparsely connected networks.&nbsp;<br />
Lecture 5) Energy landscape and open problems<br />
To derive a theory for gradient descent methods, it is important to have some understanding of the energy landscape. In this lecture, we give an overview of existing results. The second part of the lecture is devoted to future challenges in the field, and we describe the steps needed for the further development of the statistical theory of deep networks.</p>
<p>See the video at https://smartech.gatech.edu/handle/1853/60935</p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Lecture 2) Theory for shallow networks ]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Lecture 2) Theory for shallow networks&nbsp;<br />
We start with the universal approximation theorem and discuss several proof strategies that provide insight into the functions that can be easily approximated by shallow networks. Based on this, we survey approximation rates for shallow networks and show how they lead to estimation rates. In the lecture, we also discuss methods that fit shallow networks to data.<br />
&nbsp;</p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2019-03-08T10:30:00-05:00]]></value>
      <value2><![CDATA[2019-03-08T11:30:00-05:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Faculty/Staff]]></value>
      </item>
          <item>
        <value><![CDATA[Postdoc]]></value>
      </item>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
          <item>
        <value><![CDATA[Graduate students]]></value>
      </item>
          <item>
        <value><![CDATA[Undergraduate students]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[<p>huo@gatech.edu</p>
]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[https://isye.gatech.edu/about/maps-directions/isye-building-complex]]></url>
      <title><![CDATA[Groseclose Building]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>602673</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[TRIAD ]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1795</tid>
        <value><![CDATA[Seminar/Lecture/Colloquium]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>109581</tid>
        <value><![CDATA[deep learning]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
