Computers and robots could soon be having meaningful conversations and even arguments with humans, potentially within the next three years, scientists have claimed.
Researchers from the University of Aberdeen will develop systems that allow humans to debate decisions with robots, opening up the possibility of human operators discussing action plans with the machines, a media report said.
The aim is to increase human trust in intelligent technology, and early versions of the software could be available in just three years, lead researcher Dr Wamberto Vasconcelos said.
Autonomous systems such as robots are an integral part of modern industry, used to carry out tasks without continuous human guidance, he said.
Employed across a variety of sectors, these systems can quickly process huge amounts of information when deciding how to act. However, in doing so, they can make mistakes which are not obvious to them or to a human, he said.
Evidence shows there may be mistrust when there are no provisions to help a human to understand why an autonomous system has decided to perform a specific task at a particular time and in a certain way, he added.
"What we are creating is a new generation of autonomous systems which are able to carry out a two-way communication with humans," Vasconcelos said.
The system will communicate with words on a computer screen rather than speech. Potential applications could include unmanned robot missions to planets or the deep sea, defence systems and exploring hostile environments such as nuclear installations.
A typical dialogue might involve a human operator asking a computer why it made a particular decision, what alternatives there were and why these were not followed.
One factor that has to be taken into account is ensuring the computer's responses do not seem threatening, rude or confrontational.
"That's something we're going to have to look at," Vasconcelos said.
Conversing with robots would actually make humans more accountable, since failures could not conveniently be blamed on computer error, he added.