A collaborative approach to privacy and safety
RAID caught up with Andy Phippen, Professor of IT Ethics and Digital Rights at Bournemouth University and co-founder of the High-Level Working Group for Privacy and Safety
The Working Group is trying to move the conversation around privacy and safety on. Those of us who have been researching the area for twenty years, particularly in relation to children and young people, see a constant cycling of the same prohibitive ideas, a lack of youth voice and a belief that technology can solve these issues.
We want to, at least, put work into the public domain that shows agreement across a range of stakeholders that this approach does not work and that, as with many complex social problems, a mix of education, tools and law is more successful. We hope, of course, to go further than this and change thinking across the space, moving from prohibitive approaches to those that adopt more of a harm reduction philosophy.
Privacy and safety are both expectations and rights, and one should not be sacrificed to justify the other. Speaking from the perspective of young people, yes, of course they should be able to use online services with the risk of harm mitigated, but that mitigation shouldn't come from constant monitoring or other "safety tech", such as tracking, so that adults can "make sure they are safe".
Young people have a right to privacy, underpinned by the UN CRC, to which most countries have signed up, and, in some cases, that privacy is from the very people who claim to wish to safeguard them.
The key challenge arises from policymakers' poorly defined notion of safety: do we mean freedom from harm, or reducing harm? Prohibitive approaches are doomed to fail; can we seriously expect a platform to guarantee freedom from harm when most harms are caused by other people? If, however, we adopt more of a harm reduction approach, equipping young people with the knowledge both to protect their online identities and resources (the privacy dimension) and to understand the risks and how to mitigate them (the safety dimension), they are in a far better position to assure both safety and privacy.
It’s essential. I always refer to StopNCII, a project between Meta and SWGfL Trust, as one of the best examples of industry and civil society working together, using technology that works to empower victims of image-based abuse. With this solution we have both industry and civil society working to their strengths, rather than pointing the finger at each other saying “do more”!
Events like this are important because it is crucial that stakeholders are in the room together, talking and listening. One of the most powerful outcomes of the Working Group to date has been that stakeholders are talking to each other and gaining a better understanding of each other's positions (while also realising we all actually want the same outcomes!), rather than working in isolation, in conflict with each other.