<node id="689593">
  <nid>689593</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1775763546</created>
  <changed>1775763597</changed>
  <title><![CDATA[PhD Defense by Mengqi Lou]]></title>
  <body><![CDATA[<p><strong>Title:</strong>&nbsp;Two Aspects of Statistical Learning in High Dimensions: Iterative Algorithms and Average-case Reductions</p><p>&nbsp;</p><p>Mengqi Lou</p><p>ACO PhD student</p><p>School of Industrial and Systems Engineering</p><p>&nbsp;</p><p><strong>Date:</strong>&nbsp;April 22, 2026<br><strong>Time:</strong>&nbsp;12:00 PM – 2:00 PM (ET)<br><strong>Location:</strong>&nbsp;Groseclose 226, Georgia Tech Campus</p><p><strong>Zoom:</strong> TBA</p><p><strong>Thesis</strong>: <a href="https://drive.google.com/file/d/1dUQuAif01CY-8o2ccTlNoDsVtQHR-Pqx/view?usp=sharing">https://drive.google.com/file/d/1dUQuAif01CY-8o2ccTlNoDsVtQHR-Pqx/view?usp=sharing</a></p><p><strong>Committee</strong>:</p><p>Dr. Ashwin Pananjady (Advisor), Schools of Industrial and Systems Engineering &amp; Electrical and Computer Engineering, Georgia Tech<br>Dr. Cheng Mao (Reader), School of Mathematics, Georgia Tech<br>Dr. Will Perkins, School of Computer Science, Georgia Tech<br>Dr. Justin Romberg, School of Electrical and Computer Engineering, Georgia Tech<br>Dr. Guy Bresler, Department of Electrical Engineering and Computer Science, MIT</p><p><strong>Abstract:</strong></p><p>The task of learning the underlying parameters of a statistical model from noisy samples is ubiquitous in modern signal processing and data science. Both computational and statistical challenges arise, especially in high-dimensional settings where the number of parameters is comparable to (or exceeds) the sample size. On the computational side, iterative algorithms are commonly used to fit complex models to random data, but their design and analysis are often guided by worst-case upper bounds that may not reflect practical performance. On the statistical side, classical information-theoretic limits on sample complexity or signal-to-noise ratio may be unattainable by any polynomial-time procedure, making these limits an impractical benchmark for modern high-dimensional problems.<br><br>In this thesis, we discuss two general frameworks that address these computational and statistical challenges. In the first part, I will present a toolkit that yields sharp, iterate-by-iterate characterizations of solution quality for complex iterative algorithms on several non-convex model-fitting problems with random data. In the second part, I will present a toolkit to derive average-case “reductions” between different statistical models, illustrating how such reductions reveal the computational limits of several structured high-dimensional problems.</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Two Aspects of Statistical Learning in High Dimensions: Iterative Algorithms and Average-case Reductions]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Two Aspects of Statistical Learning in High Dimensions: Iterative Algorithms and Average-case Reductions</p>]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2026-04-22T12:00:00-04:00]]></value>
      <value2><![CDATA[2026-04-22T14:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[Groseclose 226, Georgia Tech Campus]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[PhD Defense]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
