Feminine gamification viewpoint: language bias

Artificial intelligence driven by machine learning has thrown up some interesting challenges. Because of its adaptive nature, responding to how we interact with it, it can turn out very differently from what it was designed for. Microsoft's chatbot Tay was designed to be helpful, but after people sent it abusive and racist tweets and comments, it started to respond in exactly the same way.

The challenge with coding responses is that there is no emotional filter, or value filter (right/wrong, acceptable/offensive), built into most machine-learning set-ups. Technically, most projects aim to solve a specific problem, and they do exactly that: solve that specific problem. Currently most AI developer teams are made up of affluent white men; in fact, most artificial intelligence conferences struggle to attract more than 15% female attendees. Yet the industry needs that input to understand how men and women interact differently.
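To make the idea of a "value filter" concrete, here is a tiny sketch of what wrapping one around a chatbot might look like. The word list and the stand-in reply function are purely hypothetical placeholders, not any real product's approach:

```python
# A minimal sketch of the kind of "value filter" most machine-learning
# chatbots lack by default. The blocked terms and the reply function are
# hypothetical placeholders, not any real system's API.

BLOCKED_TERMS = {"slur_example", "insult_example"}  # placeholder offensive terms

def filter_reply(generate_reply, user_message: str) -> str:
    """Generate a reply, then refuse to echo anything containing blocked terms."""
    reply = generate_reply(user_message)
    if any(term in reply.lower() for term in BLOCKED_TERMS):
        return "I'd rather not respond to that."
    return reply

if __name__ == "__main__":
    # A stand-in "model" that simply parrots the user, much like Tay ended
    # up mirroring abusive input.
    parrot = lambda msg: msg
    print(filter_reply(parrot, "You are an insult_example"))
    # -> "I'd rather not respond to that."
```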

Textio, a tool that helps companies change job-posting language to increase the number and diversity of applicants, analysed 1,700 AI employment ads and compared them to over 70,000 listings spread across six other typical IT roles. The analysis found that AI job ads tend to be written in a highly masculine way relative to other jobs. Words such as "coding ninja", "relentlessly" and "fearlessly" tend to lead to fewer women applying.
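The basic idea behind such a scan is simple enough to sketch. The word list below is illustrative only, not Textio's actual lexicon or method:

```python
# A rough sketch of the idea behind tools like Textio: scan a job ad for
# words that research associates with masculine-coded language. The word
# list is illustrative only, not Textio's actual lexicon.

import re

MASCULINE_CODED = {"ninja", "relentlessly", "fearlessly", "dominate", "rockstar"}

def masculine_coded_words(ad_text: str) -> list[str]:
    """Return the masculine-coded words found in a job ad."""
    words = re.findall(r"[a-z]+", ad_text.lower())
    return [w for w in words if w in MASCULINE_CODED]

ad = "We need a coding ninja who relentlessly and fearlessly ships features."
print(masculine_coded_words(ad))  # ['ninja', 'relentlessly', 'fearlessly']
```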

I often find the same applies when I see game-related roles, even when the company does its best to have a gender-equality policy and approach. I think tools like Textio can help just as much as role models and awareness of the fact that we all carry a personal bias towards our own kind.

Taking this one step further: in our designs, the types of questions women ask and men ask are different, and so is how they engage with robots or gadgets. I would love to see the results of robots trained by men versus robots trained by women, and the difference in their behaviour. It would show how much environment influences these factors, and the set-up is isolated enough to be relevant.

What words have you come across that are indicative of one gender or another?
