<node id="393301">
  <nid>393301</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1427975193</created>
  <changed>1475892691</changed>
  <title><![CDATA[Ph.D. Proposal Defense by Daniel Kohlsdorf]]></title>
  <body><![CDATA[Title: <strong>Data Mining in Large Audio Collections of Dolphin Signals</strong><br /><br /><strong>Daniel Kohlsdorf</strong><br />Ph.D. Student<br />School of Interactive Computing<br />College of Computing<br />Georgia Institute of Technology<br /><br /><strong>Date:</strong> Thursday, April 9th, 2015<br /><strong>Time:</strong> 10 AM to 12 Noon EST<br /><strong>Location:</strong> TSRB 223<br /><br /><strong>Committee:</strong><br />Dr. Thad Starner, School of Interactive Computing, Georgia Tech<br />Dr. Irfan Essa, School of Interactive Computing, Georgia Tech<br />Dr. Charles Isbell, School of Interactive Computing, Georgia Tech<br />Dr. Michael Beetz, University of Bremen<br />Dr. Denise Herzing, Wild Dolphin Project<br /><p><strong>Abstract:</strong></p><p>"The presented research addresses signal processing and data mining in large collections of audible dolphin communication. The goal is to develop a novel system that is capable of automatically finding patterns and their correspondences to dolphin behavior. The system will help marine biologists perform communication analysis automatically, and biologists can interactively find and test novel hypotheses using a user interface on top of the data mining system.</p><p>Current animal communication research suffers from the slow speed of manual data analysis. Researchers often search and annotate audio and video material using manual measurements, and these measurements are frequently subjective and not formally defined. Finding patterns of communication that relate to observable behavior without metrics for comparison is a tedious process that can take several years from data collection to publication.</p><p>Therefore, I propose a data mining system for audible animal communication. The system automatically learns a representation in which animal communication patterns can be easily found and compared. Furthermore, the system will be able to find communication patterns and segmentations using algorithms adopted from speech recognition and time series motif discovery. Integrated into a user interface, an algorithm that merely segments and finds patterns in animal communication can already help researchers in their work. The proposed system, however, is additionally capable of testing hypotheses about animal communication. For example, a researcher might ask whether different groups of animals use different patterns. In this case, the researchers could collect communication data from multiple groups, discover all patterns in the communication jointly, and then compare the statistical distributions for each group against the others. In this way, researchers can not only visually inspect the resulting patterns but also gain a novel, quantitative analysis method.</p><p>I hypothesize that feature learning and automatic segmentation of audible dolphin communication, along with statistical communication models, can provide valuable insight into dolphin behavior useful to marine biologists for retrospective analysis as well as scientific hypothesis generation and testing."</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Data Mining in Large Audio Collections of Dolphin Signals]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2015-04-09T11:00:00-04:00]]></value>
      <value2><![CDATA[2015-04-09T13:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>11038</tid>
        <value><![CDATA[CoC PhD Thesis Proposal Announcement]]></value>
      </item>
          <item>
        <tid>1808</tid>
        <value><![CDATA[graduate students]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
