A while back I ventured out to the MLOVE ConFestival at a castle somewhere in Saxony-Anhalt to join my fellow Dutchmen Maarten den Braber and Yuri van Geest and present on the quantified self and our perspective, as game designers, on data and its social consequences. Any time you use technology to try to influence human behaviour, you have to be especially sensitive to existing power relations and how they will shape usage.
I did a lot of research and found the discourse on QS to be limited, lacking consideration for the larger contexts (as Saarinen would say) the work will fit into. The term QS is often stretched to the point where it encompasses more or less everything, and a flurry of gadgets is being developed for all situations. Within all of this, Joshua Kauffman touches on some of the ethical issues, but the person whose writing was most valuable was Buster Benson, somebody we have admired for a while and whose websites we have used regularly.
MLOVE was a refreshingly laid-back event. Taking everybody out of the city and putting them in a castle in the countryside has a very positive effect on the interactions people have with each other. The Q&A we did at the end of our session was also one of the most stimulating and topical ones I have ever had the pleasure of participating in.
At Hubbub, when we design for motivation we usually approach it through the lens of Self-Determination Theory (SDT), which puts the needs and desires of the user at its centre and gives us a good framework from which to start asking questions. The basic needs for intrinsic motivation, according to SDT, are:
- Autonomy, allowing people to control their own directions
- Competence, helping people become better at things and giving them feedback on it
- Relatedness, situating people’s actions in a social context
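To make the framework concrete, here is a minimal sketch of how the three needs could be turned into a design checklist for a product feature. Everything here is illustrative: the `Feature` fields and the step-counter example are my own assumptions about how one might operationalise SDT, not part of the theory or of Hubbub's actual process.

```python
# Hypothetical sketch: checking a product feature against the three
# basic needs from Self-Determination Theory. Field names and the
# example feature are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    offers_choice: bool        # autonomy: can users control their own direction?
    gives_feedback: bool       # competence: do users learn how they are doing?
    has_social_context: bool   # relatedness: are actions situated socially?

def sdt_gaps(feature: Feature) -> list[str]:
    """Return which of the three SDT needs the feature leaves unsupported."""
    gaps = []
    if not feature.offers_choice:
        gaps.append("autonomy")
    if not feature.gives_feedback:
        gaps.append("competence")
    if not feature.has_social_context:
        gaps.append("relatedness")
    return gaps

# A step counter that lets you set your own goals and shows progress,
# but isolates you from other people:
step_counter = Feature("daily step counter",
                       offers_choice=True,
                       gives_feedback=True,
                       has_social_context=False)
print(sdt_gaps(step_counter))  # → ['relatedness']
```

The point of a sketch like this is only that each need becomes an explicit question to ask of every feature, rather than a vague aspiration.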
When it comes to the recording and application of data there are a ton of hairy issues¹ that are not going to be solved anytime soon. We think most of these issues can be mitigated by designing for users and keeping their autonomy front and centre. For companies operating in this space, which is coming to include most of them, properly designing the behavioural aspects of your products is the only way to guarantee that they will still be used in the future.
- ¹ An in-depth exploration of data ethics is beyond the scope of this piece, but to list some of the issues: insight into the obvious, interpretation errors, overmodelling causing poorer results, juking the stats, the fallacy of informed consent, and the certainty of scope creep [↩]