Australian police want to use artificial intelligence to act against stalkers on Tinder

The New South Wales Police Force (NSWPF) in Australia recently unveiled a proposal to make dating apps safer. Tinder’s parent company, Match Group, has been reluctant to implement it. The NSWPF has raised the possibility of using AI to analyse conversations between users, and of creating a platform through which police could access reports of sexual assault made on dating apps.

A complicated dialogue between Tinder and the Australian police

The NSWPF is currently in dialogue with Match Group regarding the safety measures it wishes to put in place. Tinder recognises that it has an important role to play in helping to prevent sexual assault and harassment, both in Australia and internationally. The dating app is committed to continuing discussions and collaborations with leading organisations that handle sexual assault cases around the world. For example, Tinder has partnered with the Rape, Abuse & Incest National Network (RAINN), with the goal of making the application safer for its community.

At the same time, the NSWPF says it wants to do everything possible to ensure that dating apps – and not just Tinder – cooperate with the police in cases of sexual violence. This follows an investigation which revealed that Tinder had not responded adequately to reports of sexual assault, allowing stalkers and abusers to disappear without a trace.

Artificial intelligence to combat harassment

The NSWPF has raised the possibility that dating apps could develop algorithms or other AI systems to monitor users and the messages they exchange. The goal would be to analyse the text of messages in order to flag behaviour consistent with harassment or sexual assault.
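To illustrate the kind of system being proposed, the sketch below shows a minimal message classifier of the sort a dating app might prototype. It is purely illustrative: the training messages, labels and model choice are invented for this example, and neither the NSWPF nor Match Group has described any specific implementation.

```python
# Minimal sketch of a message-screening classifier (illustrative only).
# The training examples and labels below are invented; a real system
# would need large, carefully labelled datasets and human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled messages: 1 = abusive, 0 = benign.
messages = [
    "if you ignore me again you will regret it",    # overt threat
    "tell me where you live right now",             # coercive demand
    "had a great time, want to grab coffee soon?",  # benign
    "no worries, talk later!",                      # benign
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a common baseline for
# text classification, not a production safety system.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Score a new message; a real deployment would route high scores
# to human moderators rather than act on them automatically.
score = model.predict_proba(["you can't hide from me"])[0][1]
print(f"abuse probability: {score:.2f}")
```

A baseline like this can pick up explicit threats, but, as the researcher quoted below points out, it says nothing about the ordinary-looking messages that often precede abuse.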

However, Dr Rosalie Gillett, a researcher at Queensland University of Technology who has studied the safety of women on dating apps, and on Tinder in particular, believes AI is unlikely to detect such problematic behaviour. Even a model trained on millions of conversations would struggle, because many of the messages stalkers send before an assault contain no clear evidence of harassment.

Automated systems designed for dating applications might be able to detect overt abuse, such as threats to a person’s physical safety. But they are unlikely to flag seemingly ordinary behaviour that stalkers also exhibit, which such systems would simply classify as the behaviour of an innocent user.

Translated from La police australienne souhaite utiliser l’intelligence artificielle pour agir contre les harceleurs sur Tinder