Considering Cybersecurity When Social Circles Share Digital Resources
Everyone who has ever shared an apartment in college has experienced some form of this challenge: You move in with a group of three other people. You think you’ve set proper boundaries and rules for the house, and everyone swears they’re laid back and not too worried about any of the others getting out of line. After a while, everyone settles in, and the next thing you know, one roommate is taking a little too much advantage of your goodwill.
Their boyfriend starts showing up at the apartment all the time, whether you know he is coming or not. Sometimes, he’s even there when your roommate isn’t. He has a key to the front door, and helps himself to what’s in the fridge. All of a sudden, you essentially have a new roommate that you never approved and don’t know a whole lot about.
Sure, it’s annoying, but consider some of the bigger implications: Shared resources are becoming vulnerable to people outside the intended group of users. Privacy is being compromised by one member of the group who is practicing poor security behavior. There is a disconnect among members of the group about what should and should not be acceptable. As the world becomes more computing-intensive, this type of behavior has even larger implications in the context of technology.
New research by members of Georgia Tech’s School of Interactive Computing has found that existing technical controls for shared digital resources fall short in facilitating collaborative governance and decision-making. The paper, which was accepted and awarded a Best Paper Honorable Mention at the 2020 CHI Conference on Human Factors in Computing Systems, examines why they fall short and offers guidance on ways to improve them.
“The idea here is that as computing infiltrates more and more of our social lives, we are lacking in ability for groups of people to come together and collaboratively think about the access control policies, the threat models we share, and decide together,” said Sauvik Das, an assistant professor in the school and the lead on the research. “What’s a fair way to control access to this resource?”
To examine these factors, Das and his team – which includes co-authors Hue Watson, Eyitemi Moju-Igbene, and Akanksha Kumari – spoke with several types of groups about their behaviors and processes: roommates, long-term friends, work colleagues – any group of 3-5 people that had been socially connected for about 3-6 months. They looked at the resources the groups collectively own and share that they would rather not have people outside the group access – a smart fridge, a conference room, or a Google Doc, for example – and asked about the threat models each member had.
“What are they afraid of?” Das said. “Who might they be against having access? How do they jointly come up with strategies? What conversations do they have about security and privacy? How does that impact their behaviors inside and outside of the group setting?”
They also had participants keep diaries to probe deeper individually – had something come up about security and privacy since they spoke as a group? What were the pain points and emergent threats on a day-to-day basis?
The research uncovered that the existing controls for digital resources drop the ball.
“They’re all designed under the understanding that people either don’t care enough or that only one person should handle the privacy,” Das said. “Most of these strategies were socially construed, but implicit because technical controls didn’t allow them to do it differently.”
One conclusion Das reached is that there need to be tools of shared governance, a tangible way for each member of the group to engage constructively with the shared privacy structure.
“What it looks like now is sort of a dictatorship,” Das explained. “Or, it can be egalitarian where anybody can adjust anything, and that causes problems too. There’s nuance of how different groups would like to control different resources. People are concerned about outsider threats, the types of information you share with people within your group, and with the reliability of people within the group to practice good behavior.
“We need a simple, fun, and easy way for people to come together and approach security and privacy as a group.”
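The paper itself doesn’t prescribe an implementation, but the contrast Das draws – a single-admin “dictatorship” versus a free-for-all where anybody can adjust anything – can be illustrated with a small sketch of a middle ground: a policy where granting an outsider access to a shared resource requires majority approval from the group. All names here (`SharedResourcePolicy`, `propose`, `approve`) are hypothetical, invented for illustration only.

```python
# Hypothetical sketch of group-governed access control: no single member
# can unilaterally grant an outsider access; a majority must approve.
# This is an illustration of the "shared governance" idea, not code
# from the paper.

class SharedResourcePolicy:
    def __init__(self, members):
        self.members = set(members)   # the social group
        self.allowed = set(members)   # who may access the resource
        self.pending = {}             # proposed guest -> set of approvers

    def propose(self, proposer, guest):
        """A member proposes granting access to someone outside the group."""
        assert proposer in self.members
        self.pending[guest] = {proposer}

    def approve(self, member, guest):
        """Members vote; access is granted only on majority approval."""
        assert member in self.members and guest in self.pending
        self.pending[guest].add(member)
        if len(self.pending[guest]) > len(self.members) / 2:
            self.allowed.add(guest)
            del self.pending[guest]

    def can_access(self, person):
        return person in self.allowed
```

In the roommate scenario above, the boyfriend would get a key only after most of the household signed off – the decision becomes explicit and collective rather than implicit and unilateral.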