On Sunday, Microsoft announced the acquisition of Semantic Machines, a conversational AI company that uses machine learning to make interactions between users and information or services more natural.
Founded in August 2014, the company built technology for Apple’s voice assistant, Siri, and Google Now. It works on technologies including natural language processing, deep learning, speech recognition and synthesis, semantic understanding and linguistics, according to the company’s website.
The acquisition comes two weeks after CEO Satya Nadella announced the addition of more than 100 features to the Microsoft bot framework to help customers build and customize conversational AI tools, reports TechCrunch. The updates help customers build and run bots, whether hosted in Azure or elsewhere, across platforms including Microsoft Teams, Cortana, Facebook Messenger, Slack and other websites or applications.
Voice assistants are increasingly popular at home and in the office, with an estimated 175 million smart speakers expected in American households by 2022. The Google Assistant, which has outperformed its competitors in accuracy for two years running, had the world questioning whether the Turing test had been beaten after a recent demonstration at the Google I/O conference in which it called a restaurant and a hair salon.
Some proprietary AI systems have already reached human parity in speech recognition, and algorithms are getting better every day at making voice interactions with people more lifelike.
Companies are making the voices of AI-based systems indistinguishable from humans, improving inflection and expressiveness. And Amazon has gone so far as to give Alexa a personality, coding her as an ESFJ on the Myers-Briggs scale and giving her a celebrity crush on Benedict Cumberbatch.
But the implications of conversational AI extend far beyond smoothing interactions between customer and machine. The enterprise is heading in a direction where workers can have a conversation about company data with a machine, according to Pradeep Kumar, senior big data architect at Lenovo, in an interview with CIO Dive.
This process is already taking root. Microsoft, for example, uses conversational bots to help ease workers' daily burdens, such as scheduling and staying on task. And JPMorgan Chase is opening up analyst reports and research to customers through Alexa, and is considering expanding the partnership to more features, such as bond and swap price reporting.
Salesforce, for example, introduced Conversational Queries to its AI platform, Einstein, in March, allowing users to type in informal queries about their data such as, “show me the top accounts by annual revenue” and receive visualized answers.
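The kind of query-to-answer mapping Salesforce describes can be sketched, purely as an illustration: an informal request such as "show me the top accounts by annual revenue" resolves to a sort-and-limit over tabular data. The sample records, the `top_accounts_by` helper, and the field names below are hypothetical, not Einstein's actual API.

```python
# Hypothetical illustration of resolving an informal query like
# "show me the top accounts by annual revenue" against tabular data.
# The records and helper below are invented for this sketch; this is
# not Salesforce Einstein's implementation.

accounts = [
    {"name": "Acme", "annual_revenue": 120_000},
    {"name": "Globex", "annual_revenue": 300_000},
    {"name": "Initech", "annual_revenue": 90_000},
]

def top_accounts_by(field, rows, limit=3):
    """Return up to `limit` rows, sorted descending by the named field."""
    return sorted(rows, key=lambda r: r[field], reverse=True)[:limit]

for row in top_accounts_by("annual_revenue", accounts):
    print(row["name"], row["annual_revenue"])
```

In a real conversational-query system, the hard part is the natural-language step that maps the user's phrasing to the field and aggregation; the data operation itself, as above, is routine.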
Integrating conversational and other self-service capabilities into business opens valuable data up to workers outside of traditional data analysis roles. Untapped data is a wasted business resource, and these tools hold the potential to unlock valuable business insight and profit.
Date: May 21, 2018