Researchers at Facebook Artificial Intelligence Research built a chatbot earlier this year that was meant to learn how to negotiate by mimicking human trading and bartering.
But when the social network paired two of the programs, nicknamed Alice and Bob, to trade against each other, the bots began to develop their own bizarre form of communication.
The chatbot conversation “led to divergence from human language as the agents developed their own language for negotiating,” the researchers said.
The two bots were supposed to be learning to trade balls, hats and books, assigning value to the objects and then bartering them with each other.
But since Facebook’s team assigned no reward for conducting the trades in English, the chatbots quickly developed their own terms for deals.
“There was no reward to sticking to English language,” Dhruv Batra, a Facebook researcher, told FastCo. “Agents will drift off understandable language and invent codewords for themselves.
“Like if I say ‘the’ five times, you interpret that to mean I want five copies of this item. This isn’t so different from the way communities of humans create shorthands.”
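Batra's example can be illustrated with a toy sketch. The code below is hypothetical and is not the bots' actual protocol; it simply shows the kind of repetition-based shorthand he describes, where repeating a filler token encodes the quantity requested:

```python
# Toy illustration of a repetition-based shorthand (hypothetical,
# not the actual protocol the Facebook bots developed): saying
# "the" five times encodes a request for five copies of an item.

def encode(item: str, count: int) -> str:
    """Encode a request for `count` copies of `item` by repeating 'the'."""
    return " ".join(["the"] * count + [item])

def decode(message: str) -> tuple[str, int]:
    """Recover the item and quantity from a repetition-encoded message."""
    tokens = message.split()
    count = 0
    while count < len(tokens) and tokens[count] == "the":
        count += 1
    item = " ".join(tokens[count:])
    return item, count

msg = encode("ball", 5)
print(msg)          # the the the the the ball
print(decode(msg))  # ('ball', 5)
```

Such an encoding is opaque to a human reader but perfectly unambiguous between the two agents, which is why, absent any reward for staying in English, it can win out.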
After shutting down the incomprehensible conversation between the programs, Facebook said the project marked an important step towards “creating chatbots that can reason, converse, and negotiate, all key steps in building a personalized digital assistant”.