<node id="415031">
  <nid>415031</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1434444964</created>
  <changed>1475892734</changed>
  <title><![CDATA[Ph.D. Proposal by Arridhana Ciptadi]]></title>
  <body><![CDATA[<p>Ph.D. Thesis Proposal Announcement</p><p>&nbsp;</p><p>Title: Interactive Tracking and Action Retrieval to Support Human Behavior Analysis</p><p>&nbsp;</p><p>Arridhana Ciptadi</p><p>Ph.D. Student</p><p>School of Interactive Computing</p><p>College of Computing</p><p>Georgia Institute of Technology</p><p>&nbsp;</p><p>Date: Tuesday, June 16th, 2015</p><p>Time: 2PM to 4PM ET</p><p>Location: Klaus 1212</p><p>Committee:</p><p>Dr. James M. Rehg, School of Interactive Computing, Georgia Tech (co-Advisor)</p><p>Dr. Gregory D. Abowd, School of Interactive Computing, Georgia Tech (co-Advisor)</p><p>Dr. Agata Rozga, School of Interactive Computing, Georgia Tech</p><p>Dr. Daniel Messinger, Department of Psychology, University of Miami</p><p>Dr. Pietro Perona, Division of Engineering and Applied Science, California Institute of Technology</p><p>&nbsp;</p><p>Abstract:</p><p>The goal of this thesis is to develop a set of tools for continuous tracking of behavioral phenomena in videos to support the study of human behavior. Current standard practices for extracting useful behavioral information from video are difficult to replicate and demand substantial human time. For example, extensive training is typically required before a human coder can reliably code a particular behavior or interaction, and manual coding typically takes far longer than the video itself. The time-intensive nature of this process severely limits the scalability of a study and makes it difficult for a third party to perform a replication study.</p><p>To address this issue, I have developed an efficient interactive tracking and behavior retrieval system. These tools allow behavioral researchers and clinicians to more easily extract relevant behavioral information and to analyze behavioral data from videos more objectively. I have demonstrated that my behavior retrieval system achieves state-of-the-art performance in a preliminary experiment; in the proposed work, I will evaluate this system more thoroughly on a wider set of behaviors. I have also demonstrated that my interactive tracking system produces high-precision tracking results with less human effort than the state of the art. In the proposed work, I will show how a proximity measure derived from the results of interactive tracking can be used to predict attachment classification in the Strange Situation, a protocol for studying infant attachment security. I will also show how this measure opens a new avenue of behavioral research by using it to quantify the consistency of infant behavior across the two reunion episodes of the Strange Situation.</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Interactive Tracking and Action Retrieval to Support Human Behavior Analysis]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2015-06-16T15:00:00-04:00]]></value>
      <value2><![CDATA[2015-06-16T17:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>208</tid>
        <value><![CDATA[computing]]></value>
      </item>
          <item>
        <tid>1808</tid>
        <value><![CDATA[graduate students]]></value>
      </item>
          <item>
        <tid>913</tid>
        <value><![CDATA[PhD]]></value>
      </item>
          <item>
        <tid>3395</tid>
        <value><![CDATA[proposal]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
