This article has been translated from our Spanish edition using AI technologies. There may be errors due to this process.
Looking to offer its users a better experience, Google Docs has launched a number of (alleged) improvements, including assisted writing. The idea is that the application helps us write better and faster. It is no longer just about spelling, grammar, and style suggestions, but also recommendations to avoid terms that are not considered inclusive.
When the system detects a term it deems inappropriate, the tool alerts the user that the content may “not include all readers.” Although it is a well-intentioned tool, in practice it has proved restrictive and annoying for some of the users who have already tried it.
According to the Vice website, among the terms that the English version has flagged as non-inclusive are “owner” and “motherboard,” although they obviously are not. The site’s editors did some testing, transcribing excerpts from famous works and speeches to see what suggestions the tool made, and the criterion does not seem very effective: Martin Luther King’s famous “I Have a Dream” speech raised more flags than the transcript of an interview with Ku Klux Klan leader David Ernest Duke.
After extensively testing the tool, Vice’s team contacted a Google Docs spokesperson, who explained that the technology is constantly evolving: “Assisted writing uses language understanding models that rely on millions of common phrases and sentences to automatically learn how people communicate. This also means that they may reflect some human cognitive biases. Our technology is always improving, and we do not yet have (and may never have) a complete solution for identifying and mitigating all undesirable word associations and biases.”
Although the assisted writing feature can be turned off at any time and the system does not force the author to write in a certain way, the feeling is that it tries to regulate the way users express themselves. This case shows that, however powerful they are, artificial intelligence and algorithms are still unable to differentiate and emulate some things that come naturally to humans.