Google Duplex: Inside Google’s Latest Voice Technology and What It Means For AI

By Riley Panko
6 Min Read

People are unnerved by artificial intelligence that’s indistinguishable from a human. Google Duplex is a new voice synthesis technology that uses bots to mimic human speech.

Designed to help people make simple calls, such as booking dinner reservations, the AI aims to free people from the monotony of simple, repetitive phone conversations. While the technology is impressive, its eerie humanity is disquieting to many.

Unveiled at the Google I/O conference in May 2018, the assistant proved capable of realistically copying human conversation. It gave a greeting, offered key details for a dinner reservation, and signed off with a cheery goodbye.


When the bot did not identify itself as a robot, call recipients couldn’t tell whether they were talking to an AI or a human. Duplex’s AI is promising, but some find such lifelike automation in everyday life uncomfortable.

This article explores the Google Duplex technology and what it means for the future of AI.

Duplex Available But Limited

Duplex is now available on Google’s Pixel smartphones in select markets, including New York, Atlanta, Phoenix, and the San Francisco Bay Area. The bot-calling feature currently works only for restaurants without online reservation systems, but Google also plans to extend Duplex to hair salons in the future.

Restaurants can opt out of Duplex calls the moment they receive one. Alternatively, a restaurant can remove itself from Duplex calling in advance through Google My Business, the portal that manages business listings for Google Search and Maps.

Google engineers are optimistic that this convenient feature will free up people’s time and help businesses bring in more customers. Conversely, restaurant owners have voiced concern that such convenience will lead people to overbook and flake on reservations.

Data about Duplex’s adoption and user satisfaction are forthcoming.

How Human is Duplex?

Unlike Apple’s Siri or Amazon’s Alexa, Google Duplex can easily be mistaken for a human. The AI even injects fillers such as “um,” “mm-hmm,” and “ah” into its speech to enhance its authenticity, making it one of the first AI systems that can operate in daily life without being recognized as a machine.

Google purposefully made the conversations as lifelike as possible to streamline communication. The hope is that human-sounding AI will lead to more natural conversations.

“The person on the other end shouldn’t be thinking about how do I adjust my behavior. I should be able to [speak normally] and the system adapts to that,” explained Nick Fox, the product design executive behind Google Search.
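To make the idea concrete, here is a minimal Python sketch of how filler-word injection might work before a reply is handed to a text-to-speech engine. It is purely illustrative; the function and filler list are assumptions for this article, not Google’s pipeline, which generates speech with far more sophisticated models.

```python
import random

# Illustrative only: sprinkle filler words ("disfluencies") into a scripted
# reply before sending it to a text-to-speech engine, so the synthesized
# voice sounds less robotic. This is NOT Google's implementation.
FILLERS = ["um", "uh", "mm-hmm"]

def add_disfluencies(sentence: str, rate: float = 0.15) -> str:
    """Randomly insert a filler word before roughly `rate` of the words."""
    words = sentence.split()
    output = []
    for word in words:
        if random.random() < rate:
            output.append(random.choice(FILLERS) + ",")
        output.append(word)
    return " ".join(output)

print(add_disfluencies("I'd like to book a table for four on Friday at 7 pm."))
# Possible output: "I'd like to um, book a table for four on Friday at uh, 7 pm."
```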

Yet the AI could create friction with consumers who prioritize human interaction when speaking with a business. Research shows that when people encounter an automated phone system, the “option for human interaction” is the quality they value most, as the chart below shows.

[Chart: When dealing with an automated phone system, consumers rank the option for human interaction as the most important quality.]

While Google Duplex may sound lifelike, it is ultimately still a robot.

Google’s policy is to have the bot disclose that it is not human when making calls. Duplex calls now open with the bot informing the recipient that it is Google Duplex and that the call is being recorded.

If the recipient doesn’t want to be recorded, the call is transferred to a non-recorded line. This recently added protocol reflects Google’s sensitivity to concerns about how the recordings were handled, as call recipients previously were not asked for consent.

Duplex isn’t wholly automated, however: human call center workers join the call if difficulties arise. Interruptions and off-topic questions, such as those about food allergies, have proven enough to derail the bot. Even so, test call recipients report being highly disoriented when a human voice suddenly takes over the call.
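The behavior described above (disclosure, a recording-consent check, scripted answers, and a human handoff when the conversation goes off-script) amounts to a simple call flow. The sketch below is a toy Python simulation under those assumptions; the phrases, script, and handoff trigger are hypothetical placeholders, not Google’s actual system.

```python
# Toy simulation of the call flow described above: disclose the bot, check
# recording consent, answer on-script questions, and hand off to a human
# operator when the conversation goes off-script. All phrases and rules
# here are hypothetical placeholders, not Google's actual system.
SCRIPT = {
    "how many people": "Four people, please.",
    "what time": "Seven o'clock on Friday, please.",
}

def handle_call(recipient_lines, consents_to_recording=True):
    print("BOT: Hi, I'm calling with Google Duplex to book a table. "
          "This call is being recorded.")
    if not consents_to_recording:
        print("SYSTEM: transferring to a non-recorded line")
        return
    for line in recipient_lines:
        answer = next((a for cue, a in SCRIPT.items() if cue in line.lower()), None)
        if answer is None:  # off-script question, e.g. about food allergies
            print("SYSTEM: handing the call to a human operator")
            return
        print(f"BOT: {answer}")
    print("BOT: Great, thank you! Goodbye.")

handle_call(["Sure, how many people?",
             "And what time?",
             "Does anyone in the party have food allergies?"])
```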

The adoption of Duplex will be inversely proportional to how unsettling the average call experience is.

Duplex AI Going Forward

AI such as Duplex won’t be unique for much longer. Google has openly published details of how the technology works, and competitors such as Apple and Amazon can draw on that research to build similarly lifelike capabilities into their own voice assistants.

Google recently announced that Duplex will soon be able to screen calls on Pixel phones. This new functionality will greet callers with an unmistakably synthetic voice, then ask them to describe their reason for calling. A live call transcript will appear on the phone’s screen, so the recipient can choose whether or not to pick up.
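As a rough illustration of that screening flow, the sketch below answers with a synthetic greeting, streams a transcript to the screen, and leaves the final decision to the phone’s owner. The helper functions passed in are stand-ins invented for this example, not real Pixel or Duplex APIs.

```python
# Rough illustration of the screening flow: answer with a clearly synthetic
# greeting, transcribe the caller's explanation in real time, show it on
# screen, and let the owner decide. The helpers passed in are stand-ins,
# not real Pixel or Duplex APIs.
def screen_call(caller_audio_chunks, transcribe, show_on_screen, owner_decides):
    show_on_screen("ASSISTANT: Hi, the person you're calling is using a "
                   "screening service. What is this call about?")
    transcript = []
    for chunk in caller_audio_chunks:           # streamed caller audio
        text = transcribe(chunk)                # speech-to-text, chunk by chunk
        transcript.append(text)
        show_on_screen("CALLER: " + text)       # live transcript on the screen
    return owner_decides(" ".join(transcript))  # pick up, decline, or mark spam

decision = screen_call(
    ["Hi, this is the pharmacy", "calling about your prescription."],
    transcribe=lambda chunk: chunk,             # pretend audio is already text
    show_on_screen=print,
    owner_decides=lambda text: "pick up" if "pharmacy" in text else "decline",
)
print("OWNER DECISION:", decision)
```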

The technology’s applications aren’t all positive, though. Consumers are already plagued by billions of robocalls every year, and Duplex’s AI could be tuned for malicious purposes. “You can use this for sales, you can use this for social engineering attacks,” said Roman Yampolskiy, director of the cybersecurity laboratory at the University of Louisville.

Duplex’s benefits have yet to be fully realized, but some say its security risks are already apparent. “People will find ways to use this technology that we can never anticipate,” said Yampolskiy. To guard against misuse such as robocalling, legislation could help regulate how bots represent themselves and how businesses deploy them.

What Does Google Duplex Mean For AI?

The future will be full of AI, but only the AI that keeps people comfortable and makes their lives easier will take hold.

Consumers seek convenience, but also human contact with businesses. To see robust adoption, Duplex will need to alter people’s expectations about the relationship between AI and business communications.

Riley Panko is a Senior Content Developer and Marketer at Clutch, a Washington, D.C.-based research, ratings and reviews platform for B2B services. She conducts relevant research that aims to help consumers enhance their business and select the services and software best-suited to their needs.