AYA

a voice assistant pushing back on sexual harassment



> In the design exploration “AYA pushes back”, I explored different ways a voice assistant may push back on sexual harassment.

> As part of my research into sexual harassment in Japan, I facilitated a workshop where Japanese students were asked to design an AI to help out in situations of harassment.

> Personal notes on ways AYA could push back.

︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎︎

Keywords: Design exploration, voice assistant, sexual harassment

Category: Design exploration
Location: Kyoto, Japan
University: Kyoto Institute of Technology
Collaborative partners: Tanpopo-no-ye, Vestjysk Gymnasium Tarm, KYOTO D-lab
Collaborator: Trieuvy Luu
Supervision: Prof. Julia Cassim
In video: me
Year: 2017

Did you know that a good chunk of the questions put to personal assistants like Siri, Alexa or Cortana are not about the weather, your next meeting or the nearest restaurant, but are questions or comments of a sexual or violent nature?

We are talking with more and more computers, and we are only just beginning to learn how to do this. At a moment like this, with global attention on sexual harassment in the wake of #MeToo, it is important to examine how personal assistants respond to sexual harassment.

Together with Trieuvy Luu, I ran an experiment exploring how a personal assistant could push back.

Watch the video to meet AYA!

Published in:

Marie Louise Juul Søndergaard and Lone Koefoed Hansen. 2018. Intimate Futures: Staying with the Trouble of Digital Personal Assistants through Design Fiction. In Proceedings of DIS 2018. ACM.

Marie Louise Juul Søndergaard. (submitted, 2018). AYA Pushes Back. In Wilful Technologies: A Publishing Experiment on Feminist + Technologies + Design, edited by Madeline Balaam and Lone Koefoed Hansen.
