Google’s made its AI less biased by banning gendered pronouns

After introducing Gmail's new predictive text feature 'Smart Compose' in May, Google has shed light on how it makes the feature work for all users, revealing to Reuters that it banned all gendered pronouns from the tool in an effort to make it more inclusive. 

So, if you type out the words 'I love', Smart Compose may suggest the words 'it' or 'you' to complete the sentence – but it will never suggest 'him' or 'her'.
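Google hasn't published how the ban is implemented, but conceptually it amounts to a blocklist filter over candidate completions. The sketch below is purely illustrative (the function names and word list are assumptions, not Google's code): any suggested phrase containing a gendered pronoun is simply dropped before it reaches the user.

```python
# Hypothetical sketch only: drops candidate completions that contain
# a gendered pronoun, mimicking the behavior described in the article.

GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def filter_suggestions(candidates):
    """Return only the candidate completions free of gendered pronouns."""
    safe = []
    for phrase in candidates:
        words = {w.strip(".,!?").lower() for w in phrase.split()}
        if words.isdisjoint(GENDERED_PRONOUNS):
            safe.append(phrase)
    return safe

print(filter_suggestions(["you", "it", "him", "her very much"]))
# 'him' and 'her very much' are filtered out; 'you' and 'it' survive.
```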

This is because Google is worried that its predictive technology will inadvertently offend users by suggesting the incorrect pronoun. Although it sounds like a harmless mistake, the repercussions of biased technology could be huge, as one of Google's research scientists discovered in January:

When the researcher typed “I am meeting an investor next week,” Smart Compose suggested a possible follow-up question: “Do you want to meet him?” instead of “her.” Not only could this lead to embarrassing gaffes, but it could also reinforce outdated gender stereotypes and leave some users feeling alienated by the technology.

So, the company decided to cover its back by removing any trace of gendered language from Smart Compose. However, it still has some way to go to remove bias in its search engine's autocomplete function, which has thrown up anti-Semitic and racist results in the past.

Google’s Smart Compose tool in action

Staying composed

Smart Compose was announced at Google's I/O conference in May, and it's designed to make writing everyday emails quicker than ever before.

Like a smarter version of predictive texting on your smartphone, it uses AI to analyze what you've typed so far and suggest the word you're likely to type next, based on messages you've written before.
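The real system relies on a neural language model, but the core idea of predicting the next word from your past messages can be sketched with a toy bigram counter (everything below is an illustrative assumption, not Smart Compose's actual approach): count which word most often follows each word in previous emails, then suggest the most frequent follow-up.

```python
from collections import Counter, defaultdict

# Toy sketch: learn word-pair frequencies from past messages and
# suggest the most common follow-up word. Smart Compose itself uses
# a far more sophisticated neural model.

def build_model(messages):
    """Count, for each word, which words follow it in the corpus."""
    follow = defaultdict(Counter)
    for msg in messages:
        words = msg.lower().split()
        for a, b in zip(words, words[1:]):
            follow[a][b] += 1
    return follow

def suggest_next(model, last_word):
    """Return the most frequent follow-up to last_word, or None."""
    counts = model.get(last_word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

past = ["thanks for the update", "thanks for your help", "see the update"]
model = build_model(past)
print(suggest_next(model, "thanks"))  # "for" follows "thanks" most often
```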

Although it has a little way to go yet, Google hopes that one day the technology will make it possible to compose entire sentences with just one tap of your keyboard.

Via VentureBeat
