{"681482":{"#nid":"681482","#data":{"type":"event","title":"PhD Defense by Jacob Logas","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle:\u0026nbsp;\u003C\/strong\u003ECasualizing Privacy: Bridging the Gap Between Anti-Facial Recognition Obfuscation Design and User Acceptance\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EDate:\u003C\/strong\u003E\u0026nbsp;Friday, April 11th, 2025\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ETime:\u003C\/strong\u003E 10:00 AM (Eastern Time)\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ELocation:\u003C\/strong\u003E\u0026nbsp;GVU Cafe\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EZoom Link:\u003C\/strong\u003E\u0026nbsp;\u003Ca href=\u0022https:\/\/gatech.zoom.us\/j\/91758844496?pwd=G6wpBLbaiWBeBUGrJaY9BinamYjjzd.1\u0022\u003Ehttps:\/\/gatech.zoom.us\/j\/91758844496?pwd=G6wpBLbaiWBeBUGrJaY9BinamYjjzd.1\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EJacob Logas\u003C\/strong\u003E\u003Cbr\u003EPh.D. Candidate\u003Cbr\u003ESchool of Interactive Computing\u003Cbr\u003EGeorgia Institute of Technology\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ECommittee\u003C\/strong\u003E\u003Cbr\u003EDr. Rosa I. Arriaga (Advisor) - School of Interactive Computing, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. Sauvik Das (Advisor) - Human-Computer Interaction Institute, Carnegie Mellon University\u003C\/p\u003E\u003Cp\u003EDr. Thad Starner - School of Interactive Computing, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. Duen Horng (Polo) Chau - School of Computational Science and Engineering, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. Annie Ant\u00f3n - School of Interactive Computing, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. 
Kelly Caine - School of Computing (Human-Centered Computing), Clemson University\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EAbstract\u003C\/strong\u003E:\u003C\/p\u003E\u003Cp\u003EAdvancements in computer vision have enabled effortless and ubiquitous surveillance through facial recognition, posing challenges for online users trying to maintain anonymity as their faces index their activities across social networks. Despite technically effective anti-facial recognition obfuscation methods, their low adoption rates can be attributed to insufficient attention to human factors in design. My dissertation addresses this gap with a mixed-methods approach that prioritizes user perspectives. In the first study, I design a novel obfuscation method that empowers users to avoid data collection by centralized online social networks (cOSNs). This proof of concept, evaluated with 19 marginalized users, reveals that acceptable obfuscations must support users\u2019 public presentation, even to those they are obfuscating against. In the second study, I recruit 15 users to qualitatively assess user-facing Adversarial Machine Learning (AML) obfuscation that disrupts facial recognition pipelines while optimizing for imperceptibility. I find that these systems often produce undesirable outputs when visible or are distrusted due to a lack of perceptual assurances. I synthesize these qualitative findings into a brief, validated psychometric scale (the SAIA-8) to ease the integration of user perspectives into obfuscation evaluation. In the third study, I build on insights from the prior two studies to develop AML-based obfuscations that might better align with a user\u0027s preferred presentation. I implement a set of 40 unique obfuscations that each emulate an existing aesthetic image filter. These novel obfuscations maintain adversarial effectiveness against facial recognition and outperform prior work on the SAIA-8 across 630 participants. 
Finally, I present AuraMask, an extensible development pipeline for rapid prototyping and evaluation of anti-facial recognition obfuscation designs, to accelerate future research in this space.\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003ECasualizing Privacy: Bridging the Gap Between Anti-Facial Recognition Obfuscation Design and User Acceptance\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Casualizing Privacy: Bridging the Gap Between Anti-Facial Recognition Obfuscation Design and User Acceptance"}],"uid":"27707","created_gmt":"2025-03-31 19:19:17","changed_gmt":"2025-03-31 19:19:42","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2025-04-11T10:00:00-04:00","event_time_end":"2025-04-11T12:00:00-04:00","event_time_end_last":"2025-04-11T12:00:00-04:00","gmt_time_start":"2025-04-11 14:00:00","gmt_time_end":"2025-04-11 16:00:00","gmt_time_end_last":"2025-04-11 16:00:00","rrule":null,"timezone":"America\/New_York"},"location":"GVU Cafe","extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}