Sgt Star is a multi-million dollar chatbot project that currently answers over 1,500 questions per day from candidates interested in joining the US Army.
It is described as the Army's virtual guide, built to answer candidates' questions quickly. A 300-page PDF documents the prepared responses SGT STAR can draw on.
SgtStar chatbot
Lessons learned from this military chatbot
Chatbots are a very effective cost-cutting measure, reducing the amount of time human operators spend answering everyday questions.
The Army estimates that the chatbot handles the workload of 55 recruiters.
However, using chatbots can raise questions about user privacy and the recording and tracking of chatbot conversations.
5. And finally, when the chatbot doesn't work
To conclude, there are quite a few well-publicized examples of chatbots failing.
One of the best-known examples is Microsoft's artificial intelligence chatbot Tay. Designed to emulate the way a young woman communicates, Tay was released to converse publicly on social media.
However, the bot was not closely monitored and quickly spiraled out of control under the influence of racists, trolls, and other troublemakers.
The experiment was built around "conversational learning," in which the bot emulates the conversation happening around it.
And, as you can imagine, it wasn't hard to figure out how to influence the chatbot into spewing offensive messages.
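To make the failure mode concrete, here is a minimal, hypothetical sketch of the "conversational learning" loop described above: a bot that stores whatever users say and replays it later, with no filtering. The class and method names are illustrative, not Tay's actual implementation.

```python
import random

class EchoLearningBot:
    """A deliberately naive bot that learns by parroting its audience."""

    def __init__(self):
        self.learned = ["hello!"]  # seed phrase

    def observe(self, user_message: str) -> None:
        # Naive learning: every incoming message becomes a candidate reply.
        self.learned.append(user_message)

    def reply(self) -> str:
        # The bot "emulates the conversation around it" by replaying it.
        return random.choice(self.learned)

bot = EchoLearningBot()
bot.observe("chatbots are fun")
bot.observe("<something offensive>")
# Any observed message, including the offensive one, can now come back out.
print("<something offensive>" in bot.learned)  # True
```

With no gate between "observe" and "reply," a handful of coordinated users is enough to poison the reply pool, which is essentially what happened to Tay.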
I'm posting only one screen-grabbed (and since-deleted) message from the TayTweets account, but you get the general point.
taytweets
Lessons learned from Tay Tweets
Conversational AI is still a new and developing technology, and it needs to be closely monitored.
Chatbots will be targeted by people trying to abuse them, no matter how good your intentions are.
Companies need a plan to mitigate the risks associated with chatbots.
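One concrete mitigation, sketched below under simplifying assumptions, is to gate what the bot is allowed to learn behind a safety check before a message enters the reply pool. Real moderation systems use far richer classifiers; the blocklist and function names here are placeholders, not a production approach.

```python
# Placeholder blocklist; a real deployment would use a maintained
# moderation service or classifier rather than a hard-coded set.
BLOCKLIST = {"slur1", "slur2"}

def is_safe(message: str) -> bool:
    # Crude word-level check: reject any message containing a blocked term.
    words = set(message.lower().split())
    return words.isdisjoint(BLOCKLIST)

learned = []

def observe(message: str) -> None:
    # Only safe messages are ever added to the learning pool.
    if is_safe(message):
        learned.append(message)

observe("chatbots are fun")
observe("this contains slur1")
print(learned)  # only the safe message was learned
```

Even a crude gate like this changes the failure mode from "anyone can poison the bot" to "attackers must evade the filter," which is a risk a company can plan for and monitor.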