<node id="594030">
  <nid>594030</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1501765163</created>
  <changed>1501765163</changed>
  <title><![CDATA[PhD Defense by James Bailey]]></title>
  <body><![CDATA[<p>
Title: The Price of Deception in Social Choice<br />
<br />
Time: Friday, August 4th, 2017, 2pm<br />
<br />
Location: Groseclose 402<br />
<br />
Advisor: Dr. Craig Tovey, School of Industrial and Systems Engineering<br />
<br />
Committee:<br />
Dr. Paul Griffin, School of Industrial Engineering, Purdue University<br />
Dr. Santosh Vempala, School of Computer Science, Georgia Institute<br />
of Technology<br />
Dr. Vijay Vazirani, School of Computer Science, Georgia Institute<br />
of Technology<br />
Dr. Prasad Tetali, School of Mathematics, Georgia Institute of Technology<br />
<br />
Reader: Dr. J&ouml;rg Rothe, Institute for Computer Science, University of<br />
D&uuml;sseldorf<br />
<br />
The thesis is available for public inspection in the School of Mathematics<br />
lounge (Skiles 236), the ARC lounge (Klaus 2222), the ISyE PhD student<br />
lounge and at the URL<br />
<br />
<a href="http://aco.gatech.edu/events/final-doctoral-examination-and-defense-dissertation-james-bailey" target="_blank">http://aco.gatech.edu/events/final-doctoral-examination-and-defense-dissertation-james-bailey</a><br />
<br />
Summary:<br />
<br />
Most social choice algorithms rely on private data from individuals to<br />
maximize a social utility. However, many algorithms are manipulable &ndash; an<br />
individual can manipulate her reported data to obtain an outcome she<br />
prefers, often at the cost of the social good. The literature addresses this by<br />
declaring an algorithm either &ldquo;manipulable&rdquo; or &ldquo;strategy-proof&rdquo;. However, for<br />
many decisions we are forced to use either a manipulable algorithm or an<br />
algorithm with negative properties; for instance, a dictatorship is the<br />
only reasonably strategy-proof way to decide an election. Thus, we view it<br />
as unwise to take an all-or-nothing approach to manipulation, since we often<br />
have to settle for nothing.<br />
<br />
In this dissertation, we focus on algorithmic design with strategic<br />
behavior in mind. Specifically we develop the framework to examine the<br />
effect of manipulation on social choice using a game theoretic model. Our<br />
model of human behavior is supported by extensive experimental<br />
evidence in psychology and economics, and allows us to better understand<br />
the likely outcomes of strategic behavior. We introduce a measure of<br />
manipulation &ndash; the Price of Deception &ndash; that quantifies the impact of<br />
manipulation. With the Price of Deception we are able to identify<br />
algorithms that are negligibly impacted by manipulation, algorithms where<br />
strategic behavior leads to arbitrarily poor outcomes, and everything<br />
in-between. We prove the power of our model and the Price of Deception by<br />
answering open problems in assignments, facility location, elections, and<br />
stable marriage, including a 28-year open problem by Gusfield and Irving.<br />
Our results demonstrate that the Price of Deception, like computational<br />
complexity, provides finer distinctions of manipulability than a simple<br />
&ldquo;yes&rdquo; or &ldquo;no&rdquo;.</p>

]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[The Price of Deception in Social Choice]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2017-08-04T15:00:00-04:00]]></value>
      <value2><![CDATA[2017-08-04T17:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[PhD Defense]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
