<node id="507461">
  <nid>507461</nid>
  <type>event</type>
  <uid>
    <user id="27690"><![CDATA[27690]]></user>
  </uid>
  <created>1456757236</created>
  <changed>1475893005</changed>
  <title><![CDATA[PhD Dissertation Defense by Arridhana Ciptadi]]></title>
  <body><![CDATA[<p>COMMITTEE:</p><p>&nbsp;</p><p>Dr. James M. Rehg, School of Interactive Computing, Georgia Tech (Advisor)</p><p>Dr. Gregory D. Abowd, School of Interactive Computing, Georgia Tech (co-Advisor)</p><p>Dr. Agata Rozga, School of Interactive Computing, Georgia Tech</p><p>Dr. Daniel Messinger, Department of Psychology, University of Miami</p><p>Dr. Pietro Perona, Division of Engineering and Applied Science, California Institute of Technology</p><p>&nbsp;</p><p>ABSTRACT</p><p>&nbsp;</p><p>The goal of this thesis is to develop a set of tools for continuous tracking of behavioral phenomena in videos to support the study of human behavior. Current standard practices for extracting useful behavioral information from video are difficult to replicate and demand substantial human time. For example, extensive training is typically required before a human coder can reliably code a particular behavior or interaction, and manual coding often takes far longer than the video itself (human-assisted single-object tracking, for instance, can take up to six times the video's length). The time-intensive nature of this process, driven by the need for expert training and manual coding, places a heavy burden on the research process. In fact, it is not uncommon for an institution that relies heavily on video for behavioral research to accumulate a massive backlog of unprocessed video data.</p><p>&nbsp;</p><p>To address this issue, I have developed an efficient behavior retrieval and interactive tracking system. These tools allow behavioral researchers and clinicians to extract relevant behavioral information more easily and to analyze behavioral data from videos more objectively. I have demonstrated that my behavior retrieval system achieves state-of-the-art performance in retrieving stereotypical behaviors of individuals with autism from real-world video captured in a classroom setting. I have also demonstrated that my interactive tracking system produces high-precision tracking results with less human effort than the state of the art. I further show that, by leveraging the tracking results, we can extract an objective measure based on interpersonal proximity that is useful for analyzing certain social interactions. I validated this new measure by showing that it can predict qualitative expert ratings in the Strange Situation (a procedure for studying infant attachment security), ratings that are otherwise difficult to obtain because of the extensive training human experts require.</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Interactive Tracking and Action Retrieval to Support Human Behavior Analysis]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2016-03-14T17:00:00-04:00]]></value>
      <value2><![CDATA[2016-03-14T19:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>119191</tid>
        <value><![CDATA[PhD Dissertation Defense]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
