In light of all the revelations about data manipulation and behavioural influencing by data and behavioural scientists around recent elections and votes in both the UK and the USA, I would say it is important for those of us using data, game psychology and game elements to drive behaviour change to be transparent about the purpose of our designs. What is fundamentally wrong in the case of data manipulation and behavioural influencing is that it is carried out without the consent of users. If I actively opt in, and that means understanding the consequences of opting in, then we go into the deal with our eyes wide open.
Terms and conditions, and the little statement that you are giving access to your profile data, are in my view written to protect the person looking to access your data, rarely the end-user. Think about it: when was the last time you read through all of the terms and conditions before downloading an app, activating a game or accepting the pop-ups that come with Facebook quizzes? Here is where there is room to improve the messaging: first of all, by moving from legalese to something a layperson can understand, or better again, by asking questions that explicitly state “Are you happy to share your personal information and that of your friends to influence their buying, voting or other behaviours?”. Even the short questions usually only ask if you are happy to share data, rarely for what purpose.
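To make that concrete, here is a minimal sketch in Python of what a purpose-explicit consent prompt could capture; the ConsentRecord structure, the ask_consent helper and the wording are hypothetical illustrations, not any real platform’s API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical record: capture not just *that* a user agreed, but the
# plain-language purpose they agreed to, and when.
@dataclass
class ConsentRecord:
    user_id: str
    purpose: str      # the purpose as it was shown to the user
    granted: bool
    timestamp: datetime

def ask_consent(user_id: str, purpose: str) -> ConsentRecord:
    """Ask a purpose-specific question a layperson can understand."""
    answer = input(f"Are you happy to share your data to {purpose}? (yes/no) ")
    return ConsentRecord(
        user_id=user_id,
        purpose=purpose,
        granted=answer.strip().lower() == "yes",
        timestamp=datetime.now(timezone.utc),
    )

# The question names the purpose explicitly, not just "share data".
record = ask_consent("user-42", "encourage you to be more productive")
```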
In each gamification design project, we look at the intentions of the project and what clients want to achieve with it. Typically these are harmless and above-board: more productivity, more engagement, more learning, more sales and so on. Actions you would consider normal in their given context. I think even here we could ask the question “Are you happy to allow your data to be used to encourage you to be more productive, learn more, buy more, etc.?”, again allowing the end-user the choice to opt in or out. When they opt out, their data will not be recorded, and we need to be ready with a default journey that may still include game elements but is not tailored to the individual.
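As a sketch of how that opt-in choice could gate the design, assuming illustrative names like personalised_journey and default_journey rather than a real API:

```python
# Illustrative stubs; a real design would assemble these from actual content.
def personalised_journey(user_id: str) -> dict:
    return {"user": user_id, "tracking": True,
            "elements": ["points", "badges", "tailored challenges"]}

def default_journey() -> dict:
    return {"user": None, "tracking": False,
            "elements": ["points", "badges", "generic challenges"]}

def build_journey(user_id: str, opted_in: bool) -> dict:
    """Respect the user's choice: tailor only with consent."""
    if opted_in:
        # Opted in: behaviour may be recorded and elements personalised.
        return personalised_journey(user_id)
    # Opted out: nothing recorded; same game mechanics, no tailoring.
    return default_journey()

print(build_journey("user-42", opted_in=False))
```

The design choice to note is that the opted-out path still gets the game mechanics; only the recording and the individual tailoring are switched off.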
I am a big fan of algorithms and artificial intelligence, and I see them mainly as a way to enhance what we do as humans. Anything in the wrong hands is of course a little dangerous, hence the need for transparency of intentions. On a lot of lottery or casino sites, you will find limits on how much a person can play in a day. Maybe this is also important in the case of shopping; we have actually heard stories of learning systems having to be shut down during peak customer times so that people would spend their time where they were needed most. So an element of guidance and, if you will, parental control could be useful.
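A daily limit of the kind casino sites use is simple to express in code; this is a minimal in-memory sketch, and the DAILY_LIMIT value and the plays store are assumptions for illustration only.

```python
from collections import defaultdict
from datetime import date

DAILY_LIMIT = 10  # hypothetical cap on plays per user per day

# In-memory tally keyed by (user, day); a real site would persist this.
plays: defaultdict = defaultdict(int)

def try_play(user_id: str) -> bool:
    """Allow the action only while today's count is under the cap."""
    key = (user_id, date.today())
    if plays[key] >= DAILY_LIMIT:
        return False  # cap reached: time for a break
    plays[key] += 1
    return True

# Example: the eleventh attempt in a day is refused.
for attempt in range(11):
    allowed = try_play("user-42")
print(allowed)  # False
```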
Some people may argue that by making a design transparent, you give away the game. If your intentions are pure, then I see no harm in that. In any case, you want people to know how they can win. Keeping an element of suspense and surprise is still possible, as you don’t have to reveal each of the rewards or random steps along the way. What you do want to make clear to end-users is the intention of the overall design. So, for example, if we design a gamification campaign for the on-boarding of new hires, the intentions could be to increase on-boarding speed, help the individual learn a lot about the company, and show how work is delivered and how cultural values play out.