My Research Focus: Technology-Facilitated Domestic Abuse

Technology-facilitated domestic abuse (“tech abuse”) is the increasingly common problem of technology being misused to empower domestic abusers and harm victim-survivors of abuse. It covers a wide array of harms, including surveillance, harassment, manipulation and gaslighting, and coercive control. A substantial body of prior research has explored the different forms of tech abuse; however, a range of research gaps remain, and these are the focus of my work.

The first branch of my research concerns improving our understanding of tech abuse: in particular, I am interested in how tech abuse differs, or is experienced differently, for minority groups such as the LGBTQ+ population. I have also explored how common “abusive” behaviours, such as gaining access to a partner’s phone, texts, and online accounts, are in the general population.

My second area of research explores safety mechanisms for victim-survivors escaping abusive relationships. This involves evaluating technologies such as quick exit buttons and vault apps, and developing tools to aid police and support-sector communication with victim-survivors of abuse.

The final area of my research aims to develop better interventions against technology-facilitated domestic abuse. As this is a relatively new research area, very few mechanisms exist to prevent abusive behaviours. I am interested both in evaluating systems in order to design novel interventions, and in testing and improving existing interventions such as the Apple and Google anti-stalking features for AirTags and similar devices.

I am currently exploring these research areas through my PhD; as I start my fourth year, I aim to transition into postdoctoral research in the same areas.

Academic Papers

Summary available on Google Scholar

In abusive contexts, many abusers try to gain access to their partner’s phone to exert control over the victim, as well as to spy on them and reach their other online accounts. Many users in non-abusive relationships also share access to their phones in much healthier contexts, and hold differing perceptions of how healthy various sharing behaviours are.

In this study, we survey 531 users to understand how many share access to their devices and accounts, as well as how healthy or toxic they perceive a variety of sharing behaviours to be. Fundamentally, we examine whether users prefer privacy, allowing partners to keep their personal information secret, or transparency, providing access as a way of building trust in a relationship. As well as exploring consensual sharing behaviours, we examine betrayals and why they occur. We conclude with a discussion of technological design changes that allow better enforcement of boundaries in line with the level of privacy or transparency users desire.

Privacy or Transparency? Negotiated Smartphone Access as a Signifier of Trust in Romantic Relationships

Periwinkle Doerfler, Kieron Ivy Turk, Chris Geeng, Damon McCoy, Jeffrey Ackerman, Molly Dragiewicz

Published via arXiv, available here.

"Say I'm in public...I don't want my nudes to pop up." User Threat Models for Using Vault Applications

Chris Geeng, Natalie Chen, Kieron Ivy Turk, Jevan Hutson, Damon McCoy

Presented at SOUPS 2024.

Published in Proceedings of the Twentieth Symposium on Usable Privacy and Security (SOUPS 2024).

The official publication is available here.

Many users share access to their devices with friends and partners, but may have content that they do not wish others to see. One solution to this is vault apps: tools which encrypt and hide media stored on a user’s device, often behind a PIN or other secondary security mechanism. Prior research has analysed the security of these applications, but not what users need from them or which threat models they are protecting against.

We interview 18 users of vault apps to better understand what content they store in them and which adversaries they are protecting themselves against. We define a set of threat models for vault apps, highlight which are the most relevant, and explore designs that can best protect users against these threat actors.
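As a rough illustration of the mechanism these apps build on (not the implementation of any particular vault app we studied), the TypeScript sketch below derives an encryption key from a PIN with scrypt and encrypts media with AES-256-GCM using Node’s built-in crypto module. The function names, field sizes, and storage layout are illustrative choices of my own.

```typescript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "node:crypto";

// Derive a 256-bit key from a short PIN using scrypt with a per-file salt.
// A short PIN alone is easy to brute-force, so a real app would pair this
// with rate limiting or hardware-backed key storage.
function deriveKey(pin: string, salt: Buffer): Buffer {
  return scryptSync(pin, salt, 32);
}

// Encrypt a media file with AES-256-GCM; the salt, IV, and auth tag are
// stored alongside the ciphertext so the media can later be recovered.
function lockMedia(pin: string, media: Buffer): Buffer {
  const salt = randomBytes(16);
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", deriveKey(pin, salt), iv);
  const ciphertext = Buffer.concat([cipher.update(media), cipher.final()]);
  return Buffer.concat([salt, iv, cipher.getAuthTag(), ciphertext]);
}

// Reverse the layout above: [salt | iv | tag | ciphertext].
function unlockMedia(pin: string, vaulted: Buffer): Buffer {
  const salt = vaulted.subarray(0, 16);
  const iv = vaulted.subarray(16, 28);
  const tag = vaulted.subarray(28, 44);
  const ciphertext = vaulted.subarray(44);
  const decipher = createDecipheriv("aes-256-gcm", deriveKey(pin, salt), iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]);
}
```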

Personal item tracking devices are popular for locating lost items such as keys, wallets, and suitcases. Originally created to help users find personal items quickly, these devices are now being abused by stalkers and domestic abusers to track their victims’ location over time. Some device manufacturers created “anti-stalking features” in response, and later improved on them after criticism that they were insufficient.

We analyse the effectiveness of the anti-stalking features of five brands of tracking device through a gamified naturalistic quasi-experiment, run in collaboration with the Assassins’ Guild student society. Despite participants knowing they might be tracked, and being incentivised to detect and remove the tracker, the anti-stalking features were rarely used and were not useful. We also identify additional issues with feature availability, usability, and effectiveness. Combined, these failures imply a need to greatly improve the presence of anti-stalking features to prevent trackers from being abused.

This is one of two studies I have published on item tracker stalking; the sister study is “Can’t Keep Them Away”, described below.

Stop Following Me! Evaluating the Effectiveness of Anti-Stalking Features of Personal Item Tracking Devices

Kieron Ivy Turk, Alice Hutchings

Accepted for publication at EuroUSEC 2024.

The arXiv version of this paper is available here.

Can’t Keep Them Away: The Failures of Anti-stalking Protocols in Personal Item Tracking Devices

Kieron Ivy Turk, Alice Hutchings, Alastair R Beresford

Presented at the Security Protocols Workshop 2023.

Published in Lecture Notes in Computer Science, vol 14186 (Security Protocols XXVIII).

The official publication is available here, and the open-access version is available here alongside the transcript of discussion.

A number of technology companies have introduced personal item tracking devices to allow people to locate and keep track of items such as keys and phones. However, these devices are not always used for their intended purpose: they have been used in cases of domestic abuse and stalking to track others without their consent.

In response, manufacturers introduced a range of anti-stalking features designed to detect and mitigate misuse of their products. In this paper, we explore common implementations of these anti-stalking features and analyse their limitations. In other research, we identified that very few people use anti-stalking features, even when they know that someone might be tracking them and are incentivised to evade the tracking.

We additionally identify several failures of the features that prevent them from performing their intended purpose even when they are in use. It is impossible for anti-stalking features to distinguish between ‘bad’ tracking and ‘good’ tracking. Furthermore, some features work on some types of phones for some types of tracking devices, but not all work on all phones for all trackers. Some anti-stalking features are not enabled by default, and some require manual intervention to scan for devices. We provide suggestions for how these features could be improved, as well as ideas for additional anti-stalking features that could help mitigate the issues discussed in this paper.

This is one of two studies I have published on item tracker stalking; the sister study is “Stop Following Me”, described above.

Accessing online support services can be dangerous for some users, such as domestic abuse survivors. Many support service websites contain “quick exit” buttons that provide an easy way for users to escape the site.

We investigate where exit buttons and other escape mechanisms are currently in use (by country and type of site) and how they are implemented. We analyse both the security and usability of exit mechanisms on 323 mobile and 404 desktop sites. We find exit buttons typically replace the current page with another site, occasionally opening additional tabs. Some exit buttons also remove the page from the browser history.

In analysing the design choices and shortcomings of exit button implementations, we find that common problems include cookie notices covering the buttons and buttons not remaining on screen when scrolling. We provide recommendations for designers of support websites who want to add or improve this feature.
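For illustration, a minimal quick exit handler might look like the TypeScript sketch below, which mirrors the common pattern described above: sending the user to a neutral site and replacing the current history entry so the support page does not remain in the tab’s history. The destination URL, element id, and keyboard shortcut are placeholder choices, not taken from any site in our dataset.

```typescript
// Minimal quick exit handler (illustrative only). Some sites additionally
// open extra tabs; that step is omitted here for brevity.
const NEUTRAL_SITE = "https://www.bbc.co.uk/weather"; // placeholder destination

function quickExit(): void {
  // location.replace() swaps the current history entry rather than pushing
  // a new one, so the support site disappears from this tab's history.
  window.location.replace(NEUTRAL_SITE);
}

// Wire the handler to a button and to the Escape key ("quick-exit" is a
// placeholder element id).
document.getElementById("quick-exit")?.addEventListener("click", quickExit);
window.addEventListener("keydown", (event) => {
  if (event.key === "Escape") quickExit();
});
```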

Click Here to Exit: An Evaluation of Quick Exit Buttons

Kieron Ivy Turk, Alice Hutchings

Presented at CHI 2023.

Published in the proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany.

The publication is available here.

A tight scrape: methodological approaches to cybercrime research data collection in adversarial environments.

Kieron Turk, Sergio Pastrana, Ben Collier

Presented at WACCO 2020: the Second Workshop on Attackers and Cyber-Crime Operations

Published in the proceedings of the IEEE European Symposium on Security and Privacy Workshops (EuroS&PW) 2020.

My blog post presenting this paper is available here, and the official publication in IEEE is available here.

This is a study of “adversarial scraping” for academic research, in which websites implement assorted defences to prevent web scraping while researchers attempt to bypass these defences to collect data for further investigation. Inspired by the novel “attacks on the browser” encountered on a small collection of sites, we document the range of anti-scraping features we have encountered across assorted scraping projects, as well as the countermeasures that can be used to overcome them.

We then classify the defences by effectiveness, and find that a large number of the methods used to prevent crawling are ineffective or of minimal impact. Other systems successfully slow down crawling, inhibiting historical data collection, while a small number of defences are capable of preventing scraping entirely in certain circumstances.

We identify two environments which can be analysed independently: sites hosted on Tor (“onions”) and chat channels. Onions are generally easier to scrape, as many defences are unusable under the associated privacy restrictions (such as JavaScript being near-universally disabled). Chat channels such as those on Discord and Telegram use far fewer technical measures to prevent scraping; instead, many rely on moderator intervention to identify and block accounts suspected of being bots.
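To make the “slowing down” class of defence concrete, the sketch below shows a simple crawler loop in TypeScript that spaces out its requests and backs off exponentially when a site responds with throttling status codes (429/503). The URLs, delays, and User-Agent string are illustrative assumptions, not the setup used in the paper.

```typescript
// Minimal polite crawler: fixed delay between requests plus exponential
// backoff on throttling responses. Requires Node 18+ for the global fetch.
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function crawl(urls: string[]): Promise<Map<string, string>> {
  const pages = new Map<string, string>();
  for (const url of urls) {
    let backoff = 5_000; // start at 5 seconds when the server throttles us
    for (let attempt = 0; attempt < 5; attempt++) {
      const response = await fetch(url, {
        headers: { "User-Agent": "research-crawler (contact@example.org)" },
      });
      if (response.status === 429 || response.status === 503) {
        await sleep(backoff); // server is rate limiting: wait and retry
        backoff *= 2;
        continue;
      }
      if (response.ok) pages.set(url, await response.text());
      break; // success or a non-retryable error: move on to the next URL
    }
    await sleep(2_000); // baseline delay between requests
  }
  return pages;
}
```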

University

Educated at the University of Cambridge

I have studied at Cambridge since 2017, completing a series of degrees in Computer Science with a focus on security and cybercrime.

BA Computer Science // 1st with Honours

MEng Computer Science // Honours pass with Distinction

PhD Computer Science // Ongoing