The Samaritans Radar app
First off, I should add a trigger warning: this post contains discussion of suicide and mental health issues I’ve experienced directly. I’m going to be blunt and open about some stuff, and if you might find that uncomfortable or triggering in some way, please don’t read on.
I feel I need to share the context to my comments below, as they form a big part of my reaction.
It’s taken me a few years to be able to write something like this: when I was 18, my dad took his own life. If you really want to know more about that, and how it shaped me and influences who I am today, it was the subject of my first TEDx talk – video and link to slides.
I know one of the effects that had on me is a deep desire to try and help as many people as I can – at times definitely a curse, but one I’m OK with! When I first saw the Samaritans Radar app I felt “wow, this is awesome”: it appealed to me both as a clever use of tech AND through a deep desire to never miss those signs again – selfish maybe, but I’m being honest here.
It felt like something I could easily do, that might help someone – something we could all easily do to look after one another and so at first I didn’t think about it with my normal analytical hat on.
But then I started to see some troubling posts and tweets from the very people it set out to help – concerns raised about data protection and privacy issues (the Information Rights and Wrongs post is by far the best I’ve seen) – and it made me put that hat back on.
The criticism comes in 3 main flavours:
1) Data protection concerns – I find this one hard to reconcile; I can’t believe they did this without thoroughly consulting DPA specialist lawyers. But from my understanding (and echoed by others) it breaches some simple principles: they are acting as a data processor using public data; there is a higher threshold set for processing that involves subjects who may suffer from mental health issues, which I don’t think it meets; and the largest, in my opinion: there is no opt-in (or opt-out).
2) It enables stalkers to prey on vulnerable people – this is the one I think has the least merit. I’ve personally had two quite serious stalking incidents where I came close to getting a restraining order, and I don’t see how this offers anything extra that a committed online stalker wouldn’t already know.
3) “SamaritansRadar is likely to leave me feeling worse. I’ll keep quieter and feel more isolated or end up bullied and victimised.” – This is one of many examples where people have said they feel threatened by its very existence, and for me this is the showstopper. Take a look at the hashtag – this is NOT an isolated opinion.
And that’s the killer blow for me, the sheer volume of these comments makes me feel it should be pulled.
The lesson I take from all this – one often raised during “big data” discussions – is to have an outside (really, really outside) impartial person discuss the ethical implications of using publicly available data.
Taking a step back, I can see how this could happen to me: when you explain a project to people, you bring your own filter to how you explain it. For this kind of project it’s so important to test it with users, without that filter.
So what now? Well, I think adapting this as a tool for forum administrators could be a winning idea I’ve seen suggested. What would you do next?