<node id="681135">
  <nid>681135</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1741809253</created>
  <changed>1741809289</changed>
  <title><![CDATA[PhD Defense by Shruthi Hiremath]]></title>
<body><![CDATA[<p><strong>Title:&nbsp;</strong>Deriving Bespoke Human Activity Recognition Systems for Smart Homes</p><p><strong>Date:</strong>&nbsp;Wednesday, March 26, 2025</p><p><strong>Time:&nbsp;</strong>12 PM&nbsp;– 2 PM EDT</p><p><strong>Location:</strong>&nbsp;C1215 Midtown, CODA and <a href="https://gatech.zoom.us/j/9317547921?pwd=c3hEcis0QjJDUTE1US9FclpJTjlrQT09">Zoom</a>&nbsp;meeting (ID: 931 754 7921, Passcode: 042839)</p><p>&nbsp;</p><p><strong>Shruthi K. Hiremath</strong><br>Ph.D. Candidate in Computer Science<br>School of Interactive Computing<br>Georgia Institute of Technology</p><p>&nbsp;</p><p><strong>Committee:</strong></p><p>-----------------</p><p>Dr. Thomas Ploetz (Advisor), School of Interactive Computing, Georgia Institute of Technology<br>Dr. Gregory Abowd, School of Interactive Computing, Georgia Institute of Technology<br>Dr. Sonia Chernova, School of Interactive Computing, Georgia Institute of Technology<br>Dr. Diane Cook, School of Electrical Engineering and Computer Science, Washington State University<br>Dr. Uichin Lee, School of Electrical and Computer Engineering, Korea Advanced Institute of Science and Technology</p><p>&nbsp;</p><p><strong>Abstract:</strong></p><p>-----------------</p><p>Smart Homes have come a long way: from research laboratories in the early days, through periods of (almost) neglect, to their recent revival in real-world environments, enabled by the availability of commodity devices and robust, standardized software frameworks. With such availability, human activity recognition (HAR) in smart homes has become attractive for many real-world applications, especially in the domain of Ambient Assisted Living (AAL).</p><p>&nbsp;</p><p>Yet, building an activity recognition system for a specific smart home, a specialized space with its own layout inhabited by individuals with idiosyncratic behaviors and habits, is a non-trivial endeavor. For real-world deployments, privacy and logistical concerns essentially rule out third parties collecting the much-needed annotated sensor data while the resident already lives in their smart home.</p><p>&nbsp;</p><p>I address the challenges of developing a HAR system for smart homes by defining its lifespan in three phases: <em>i)</em> bootstrapping, <em>ii)</em> updating and extending, and <em>iii)</em> expanding recognition capabilities for complex tasks. In the bootstrapping phase, I establish a system that quickly recognizes prominent activities with minimal resident involvement. The second phase introduces an update and extension procedure that improves segmentation accuracy for the previously identified activities. Finally, I enhance the system's capabilities by incorporating large language models (LLMs) as contextual knowledge bases: the LLMs encode contextual information through language-based descriptions and identify structural constructs of complex activity sequences, with the aim of improving recognition and monitoring changes in routine patterns.</p><p>&nbsp;</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Deriving Bespoke Human Activity Recognition Systems for Smart Homes]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Deriving Bespoke Human Activity Recognition Systems for Smart Homes</p>]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2025-03-26T12:00:00-04:00]]></value>
      <value2><![CDATA[2025-03-26T14:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[C1215 Midtown, CODA and Zoom meeting (ID: 931 754 7921, Passcode: 042839)]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[PhD Defense]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
