As a kid, my first brush with computers was through a Pentium desktop with a mouse and a keyboard, and I am still most comfortable using a mouse and a keyboard for doing my job. My cousin’s five-year-old, however, has been introduced to computers through an iPhone. He is so familiar with Siri that it doesn’t even occur to him to type or text when asking the machine to play games, stream music or put on his favourite cartoons.
This makes me wonder whether future tools will need to address the changing communication behaviours of the future workforce: a workforce that communicates with systems and machines by talking to them, or through gestures and facial expressions. In other words, a workforce communicating with machines in natural language through conversational interfaces.
Here are some interesting statistics I came across that make me believe that we are already on a path to transition to conversational platforms:
- More than $1 million of Snap Travel’s hotel bookings have happened through its Messenger chatbot.
- Facebook Messenger has over 100,000 chatbots operating on its platform.
- To date, there are nearly a million users on Microsoft’s Cortana digital assistant.
- Amazon is forecast to sell 60 million Alexa devices by 2020.
To top it all off, Gartner’s Top Strategic Predictions for 2018 states that “By 2021, early adopter brands that redesign their websites to support visual and voice search will increase digital commerce revenue by 30%. Also, by 2019, half of major commerce companies and retailers with online stores will have redesigned their commerce sites to accommodate voice searches and voice navigation.”
The current breed of conversational interfaces is mostly limited to presenting a list of items from which users can select. Others use a simple Q&A interface to respond to queries that contain specific keywords. The next generation of conversational interfaces, however, is evolving into artificially intelligent agents that engage with humans in natural language, in both corporate and end-consumer communication.
A prototype of such a futuristic conversational interface was demonstrated earlier this year at Google I/O 2018. Google’s CEO, Sundar Pichai, mesmerized everyone by demonstrating an enhanced version of Google Assistant, aptly called Google Duplex. The demonstration showcased a phone call made by Google Duplex to a hair salon to book an appointment on behalf of a human. Google Duplex’s responses and voice quality were so natural that the lady from the hair salon responded as though she were talking to a human.
Most technologies deployed today complete pre-specified tasks, such as scheduling an appointment or taking a pizza order. However, Google also plans to have Assistant call businesses to inquire about their operating hours, to help keep Google Maps listings up to date.
Thickstat’s conversational AI platform, ConverSight.ai, uses adaptive analytics to let corporate stakeholders ask Business Intelligence (BI) interfaces complex questions by voice, such as sales forecasts, revenue by geography, and even sales trends for individual items. This enables even casual, untrained BI users to work with these tools effectively. Sales managers, for example, spend hours sifting through sales data to prepare a review report for senior management; with such advanced conversational analytics, they can do the same in seconds.
Such tools understand the context and intent of a user’s business query by leveraging machine learning, data science, knowledge graphs and cognitive techniques to deliver personalized, human-like interaction. The impact and adoption of conversational AI interfaces will ultimately depend on the user-friendliness of the solution and the accuracy of the use cases developed.
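To make the idea of intent and context extraction concrete, here is a deliberately minimal Python sketch of how a conversational BI layer might map a natural-language question to a structured query. The intent names, keyword lists and region entities are all illustrative assumptions for this post, not how ConverSight.ai or any production system actually works (real systems use trained language models, not keyword overlap):

```python
# Toy sketch of intent and entity extraction for a conversational BI query.
# All intent names, keyword lists and entities below are illustrative
# assumptions, not any vendor's actual implementation.

import re

# Hypothetical BI intents, each with keywords that hint at it.
INTENT_KEYWORDS = {
    "sales_forecast": {"forecast", "projection", "predict"},
    "revenue_by_region": {"revenue", "geography", "region"},
    "item_trend": {"trend", "item", "sku"},
}

# Hypothetical region entities the system knows about.
REGIONS = ["emea", "apac", "americas"]


def parse_query(text):
    """Return an (intent, entities) pair for a natural-language BI question."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    # Pick the intent whose keyword set overlaps the query the most.
    # (Ties fall back to dictionary order -- fine for a sketch.)
    intent = max(INTENT_KEYWORDS,
                 key=lambda name: len(tokens & INTENT_KEYWORDS[name]))
    entities = {"region": [r for r in REGIONS if r in tokens]}
    return intent, entities


print(parse_query("Show me the revenue by region for EMEA"))
# -> ('revenue_by_region', {'region': ['emea']})
```

A production conversational interface replaces the keyword table with a trained intent classifier and adds dialogue state, so that a follow-up like “and for APAC?” inherits the previous intent, but the pipeline shape (utterance, then intent, then entities, then structured query) is the same.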
If you would like to know how such systems can help your organisation, write to us at firstname.lastname@example.org.