According to the email that dropped into my inbox just now, Alexa knows how to drop a beat. All you have to do is ask ‘Alexa, can you get funky?’ and she’ll show off her skills. I haven’t tried this yet – I turned my Echo off a while ago. After failing to find a single genuinely useful application for her, I decided the potential privacy implications weren’t worth the occasional ability to turn the music on and off by voice (I say occasional because background noise or just general tech-fuckwittery often got in the way).
Besides, anyone with an Echo knows that you spend quite a lot of time trying to figure out the correct form of words to get what you want from Alexa. One of my regular clients has one in the office, and watching the disappointment on the Head of Development’s face when he asked ‘Alexa, play some good music’ and got the response ‘Here’s some music you might like: 80’s Rock Radio’ was very entertaining, but not, ultimately, what he was after.
It got me thinking: we all know that teaching AI to understand the infinite variety of human language is a very difficult task. Parsing the exact meaning from people’s choice of words is often difficult even for other humans without the extra input from body language, and that’s before we get on to tricky concepts like irony and sarcasm. But it occurred to me that what the weekly email from Amazon is doing is trying to teach me a new language: Alexa language. It’s a specific, task-oriented language in which phrases are tied to actions in a one-to-one relationship. So “Play some good music” means something like “Play a station based on an algorithm that reads previous choices and generates a suggestion.” The same phrase doesn’t (yet) have multiple meanings based on intonation, facial expression, context or assumed intent, as it would in a human-to-human interaction.
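To make that one-to-one idea concrete, here's a toy sketch (this is emphatically not how Alexa actually works – her real pipeline involves speech recognition and intent models – and the phrases and function names here are my own invention): the "language" is just a fixed lookup table from an exact phrase to a single action, and anything off-script simply fails.

```python
# Toy model of a task-oriented "Alexa language": each phrase maps to
# exactly one action. No intonation, context, or irony can change what
# a phrase means – and unlisted phrasings get nothing at all.

def recommend_station():
    # Stand-in for "play a station based on an algorithm that reads
    # previous choices and generates a suggestion".
    return "Here's some music you might like: 80's Rock Radio"

def get_funky():
    return "Dropping a beat..."

# The one-to-one phrase-to-action relationship, as a plain dictionary.
PHRASE_TO_ACTION = {
    "play some good music": recommend_station,
    "alexa, can you get funky?": get_funky,
}

def handle(utterance: str) -> str:
    action = PHRASE_TO_ACTION.get(utterance.strip().lower())
    if action is None:
        # Any wording outside the learned phrase list falls flat.
        return "Sorry, I don't know that one."
    return action()

print(handle("Play some good music"))
print(handle("Play something with a bit of soul, you know?"))
```

The burden of learning sits entirely with the human: you have to memorise the keys of that dictionary, which is exactly what Amazon's weekly email is teaching you to do.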
I wonder what this will do to our language more generally. We know that language is a continually evolving thing that mutates when it comes into contact with other influences: you only have to count the number of words in English that originate from other languages to see that. And I’ve seen some interesting posts about whether we should make Alexa respond only to polite enquiries, so as not to teach our children rude and peremptory speech. So at a basic level, as AI grows in sophistication and power, we should expect some new words related to this technology to enter our language. But as our interactions with machines become more commonplace and numerous, will we start adapting our speech in all circumstances to this clearer, less nuanced style – perhaps something like the way we talk to a foreigner who isn’t fluent in our language? Will dual meanings and ambiguity go out of the window? (Please tell me we’ll still use puns – I do love a terrible pun-based joke.)
Much of the focus of discussion seems to be around getting AI to adapt to and work with ‘our’ world – but I think we’d be naive to think that we won’t be changed by it, and in some fairly significant ways.
If you’re reading this, I really want to know – how do you think AI will change our language? What words are currently only used in the industry that will become common parlance? Will we become less complex in our speech? And what about the language of emotion – a minefield even for human beings, let’s face it. Tell me in the comments or tweet me @CeciliaUnLtd.
Also published on Medium.