The Future of Bots
How many times a week do you find yourself conversing with, talking to, or instructing a bot?
When you call your bank, a bot picks up. She asks for your identifying number, and you trust her. Maybe you have no inherent reason to, and some would argue that gives you even less reason to trust her. This is so common in businesses' execution of customer support that it's more or less accepted.
More advanced voice systems appear in Siri and Alexa, with roots reaching all the way back to the Macintosh and Microsoft Sam. We're picking up steam in breaking down the barrier, to the point where it will be hard to tell whether the tech support on the other end of the phone is human or in fact a bot.
Past and Present
To better understand the future, let's take a step back. When we imagine a 'bot', we usually imagine something cheap and mechanical, yet physical: a toy, more or less. It has only one task, which it does rather poorly. These are the limitations of imitating human-like actions. But what about conversation? Probably our most advanced evolution, and something innate to being human, is the ability to exchange thoughts and ideas.
In 1988, a chatbot project began with the aim of exceeding what people thought they knew about bots, drawing on AI paradigms, white papers, and algorithms for what developers would later understand as natural language processing. The project went online to the public in 1997 as Cleverbot. With physical and vocal limitations out of the way, people were able to hold a rather simple conversation with some code. At first sight the project came across as something clever but simple, a toy. After all, what could this mean for the future of bots?
Having a conversation can seem challenging, depending on how far a developer and their team are willing to go in researching and developing a bot that can fool us. Outside of conversation, however, a large amount of the written text we see on Wikipedia has been edited and contributed by bots. In fact, this has been going on for so long that a few bots have ended up at war with each other, contradicting one another over what should or shouldn't be included in an article.
In more recent times, after what began with Dragon arrived on the scene with Google Home and Alexa supporting custom plugins through their advanced speech-to-text modules, a Twitch stream gained large notoriety by making two Google Home devices talk to each other using the Cleverbot program.
A large part of bot development today, with products like Alexa, involves taking speech, converting it to text, deciphering that text using either a linear dialog flow or something more advanced like a machine learning model (otherwise known as AI), figuring out what to do with the input, and then converting the output text back to audio with text-to-speech. Currently only the larger companies like Amazon, Microsoft, and Google offer a large collection of speech-to-text APIs. Converting text to speech is simple and has been around for decades; speech-to-text is more advanced. Creating a program that can understand speech is not easy, and it has taken a large amount of research to present as a product these companies can charge for, even though, across the many different vernaculars and languages, it doesn't work as well as developers would like. They do support almost every language, though it works best, but still not perfectly, in English.
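That loop can be sketched in a few lines. Everything here is illustrative: the speech engines are stubbed out (real STT and TTS would be calls to a vendor's cloud API), and the intents are a toy linear dialog flow.

```python
# A minimal sketch of the speech -> text -> dialog -> speech loop.
# The speech engines are stubs; a real bot would call a vendor API
# (e.g. from Amazon, Google, or Microsoft) at these two points.

def speech_to_text(audio: bytes) -> str:
    """Stub: a real implementation would send audio to a cloud STT service."""
    return audio.decode("utf-8")  # pretend the "audio" is already text

def decide_reply(text: str) -> str:
    """A linear dialog flow: match the input against known intents."""
    intents = {
        "hello": "Hi there! How can I help?",
        "weather": "It looks sunny today.",
    }
    for keyword, reply in intents.items():
        if keyword in text.lower():
            return reply
    return "Sorry, I didn't catch that."

def text_to_speech(text: str) -> bytes:
    """Stub: a real implementation would synthesize audio."""
    return text.encode("utf-8")

def handle_utterance(audio: bytes) -> bytes:
    """Run one full turn of the pipeline."""
    return text_to_speech(decide_reply(speech_to_text(audio)))
```

The hard part in practice is the first stub: turning messy, accented, noisy audio into that clean string is where the research budget goes.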
You'll probably start answering your phone to more advanced bots in tech support and customer service. This may or may not be in the favor of larger companies. Insurance companies are, in my opinion, built around their customer support. When I call in, I almost always get a nice lady who lives in Kansas, and she seems genuinely worried about me and my predicament, even when I'm simply getting a quote on a motorcycle. She even sprinkles in anecdotes about something that happened in her area. Either way, I can't say that insurance companies will move toward bots until those bots are better at comprehending tragic events and providing sincere dialog. What a weird timeline that would be. It is possible, however.
The thin line between AI and bot is often smeared, and we can easily lose track of which is which. With Tesla's successful line of self-driving cars, we can say with certainty that this is the future of AI, but is it really a bot? Tesla does have a foot in the door with voice commands integrated into its dashboard; it's similar to talking to Siri or OK Google while you drive. We may start to see Uber and Lyft taking a similar approach, using voice commands to interface with driverless cars, especially in dense, crowded cities where you want the car to pull over, with supported commands like "Take this next left" or "Pull over here".
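As a rough illustration, a rider-command interface could start as little more than a lookup from normalized phrases to driving actions. The phrases come from the examples above; the action names are invented for this sketch.

```python
# A toy mapping from rider voice commands to driving actions.
# The action names are hypothetical; a real system would feed a
# planner and would need fuzzy matching, not exact strings.

COMMANDS = {
    "take this next left": "TURN_LEFT_NEXT",
    "pull over here": "PULL_OVER",
}

def parse_command(utterance: str) -> str:
    """Normalize the transcript and look up a known command."""
    text = utterance.lower().strip(" .!?")
    return COMMANDS.get(text, "UNRECOGNIZED")
```

Even this toy version shows why the problem is hard: a rider who says "uh, pull over right here" misses the exact-match table entirely.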
As we start to see bots inside fast food venues, it's pretty clear why. Ordering from a menu is pretty straightforward, and most of the time the person on the other end has a hard time understanding me anyway. In the future we may pull up to a drive-thru and converse with a bot through the speakers. Some of the complications there involve keywords like "actually", "instead of", or "no wait", where the developer needs to take into account what's being changed in your order.
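One naive way to handle those correction keywords is to treat them as a signal to replace the most recent item rather than add a new one. The cue list and the crude parsing below are invented for the sketch; a production system would use proper intent and slot detection.

```python
# A toy order tracker for a drive-thru bot. Utterances that start
# with a correction cue replace the last item instead of adding one.

CORRECTION_CUES = ("actually", "no wait", "instead of that")

def update_order(order: list, utterance: str) -> list:
    """Add the spoken item, or swap the last item on a correction cue."""
    text = utterance.lower().strip()
    for cue in CORRECTION_CUES:
        if text.startswith(cue):
            item = text[len(cue):].strip(" ,")
            if order:
                order[-1] = item  # replace the most recent item
            else:
                order.append(item)
            return order
    order.append(text)  # no cue: just another item
    return order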
Every bot developer currently implements their own memory system to some degree; how deep it goes depends largely on the budget and project type. My father is not tech savvy, but as a general user he will say "Thank you" to Alexa, and no response comes back, which pains me a bit. A simple "You're welcome" would be nice. But that would mean Alexa is listening without reason, so I can understand why that's the case.
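A minimal memory layer might just pattern-match "remember" statements into a fact store, with the polite "You're welcome" thrown in. The phrasing patterns below are assumptions made for this sketch, not how Alexa or any shipping assistant actually works.

```python
# A toy memory layer: the bot stores facts the user states and
# recalls them later. A real assistant would persist this per-user.
import re

class BotMemory:
    def __init__(self):
        self.facts = {}  # e.g. {"favorite color": "green"}

    def listen(self, utterance: str) -> str:
        text = utterance.lower()
        # Store: "remember that my <key> is <value>"
        m = re.match(r"remember that my (.+) is (.+)", text)
        if m:
            self.facts[m.group(1)] = m.group(2)
            return "Got it."
        # Recall: "what is my <key>?"
        m = re.match(r"what is my (.+)", text)
        if m:
            key = m.group(1).rstrip("?")
            return self.facts.get(key, "I don't know yet.")
        # The small courtesy my father never gets back.
        if "thank you" in text:
            return "You're welcome!"
        return "Okay."
```

Even this much requires the bot to keep listening after the transaction is done, which is exactly the trade-off the paragraph above runs into.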
Sci-fi movies, novels, and literature already depict AI that exists side by side with us; everywhere there's a scene where the protagonist is seen talking to an AI. In Spike Jonze's Her, we see the AI as a supporting character, with artificial dreams, memories, and a romance with the main character. These kinds of situations are not far off, as developers come to understand how to store facts and memories and adjust our dialogs more dynamically toward these kinds of responses. A Japanese bot company made big waves with its Gatebox commercial, depicting an average Japanese man interacting with his holographic anime alarm clock. She remembered important dates and even initiated conversations through SMS, wondering where he was and when he thought he'd be back, which leads most of us to read into the AI side of things. It's very reminiscent of Her, but it's not fantasy: it's a product with a hefty price tag, ¥298,000, or about $2,600 (USD) as reported in December 2017.
After all, with the big tech companies pushing new bot frameworks that give developers easier ways to create text bots, it starts to paint a picture of how many people can begin to learn the current progress and future limitations of chatbots. With more people working together and contributing, we're full steam ahead on creating these kinds of applications, as well as making it even easier for the not-so-tech-savvy to create their own.