Another way of saying this is that machines don’t have common sense. ELIZA, a chatbot created in the 1960s at the MIT Artificial Intelligence Lab, could act as a Rogerian psychotherapist.
Chatbots, in other words, may be great at ordering stuff from Amazon or telling you to put a coat on because the forecast says it is going to rain, but they are nowhere near ready to help you fix your technical problem. Uttering comforting platitudes to the broken-hearted is not the height of intelligence. Solving complex technical problems is much harder because the problem space is much more diverse.
A coach can watch you execute a maneuver and critique your form. A chatbot cannot, and even if it could, the ability of the AI to make sense of what it is seeing and put it in the context of the user’s task just isn’t there yet. Asking a chatbot for help requires you to understand and articulate the problem, and by the time you understand a problem well enough to articulate it, you are well on your way to solving it. And you can’t tell it what the problem is because you cannot see the problem, only the result of the problem.

And suddenly every tech comm and content strategy conference seems to be about getting your content ready for chatbots. Chatbots are sexy and sex sells, even if the definition of sexy is a grey box with a speaker sitting on the counter.

But chatbots are not the future of technical communication. As Will Knight writes in “Tougher Turing Test Exposes Chatbots’ Stupidity,” current AI does barely better than chance at deciphering the ambiguity in a sentence like: “The city councilmen refused the demonstrators a permit because they feared violence.” (Who feared the violence?)

Would you prefer this interaction over a video game that actually shows you the forest and the stream tumbling along a rocky bed? (Or, you know, going outside and actually seeing a forest and a stream tumbling along a rocky bed?)