The U.S. Army has suggested guidelines for designing autonomous machines, such as self-driving cars, drones and personal assistants, that promote cooperation with soldiers.

A research paper by computer scientist Dr. Celso de Melo of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory and Dr. Kazunori Terada of Gifu University in Japan showed that emotion expressions can shape cooperation.

De Melo said that autonomous machines developed to act on behalf of people are poised to become more pervasive in human life. However, for these machines to succeed and be adopted into society, people must be able to trust and cooperate with them.

He added that human cooperation is paradoxical: an individual can be better off free riding while others cooperate, but if everyone reasons that way, cooperation never happens.

Emotion expressions can shape cooperation

The research aims to understand the mechanisms that promote cooperation, particularly the influence of strategy and signaling.

Strategy is the way individuals act in repeated or one-shot interactions. A simple example is tit-for-tat, in which an individual acts as the counterpart acted in the previous interaction. Signaling is the communication that occurs between individuals, which can be either verbal or non-verbal in nature.
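The tit-for-tat strategy described above is easy to illustrate in an iterated prisoner's dilemma. The following sketch is not from the paper; the payoff values are the standard illustrative ones from game theory, and the strategy names are our own.

```python
# Payoffs for (player1_move, player2_move); "C" = cooperate, "D" = defect.
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # player 1 is exploited
    ("D", "C"): (5, 0),  # player 1 exploits
    ("D", "D"): (1, 1),  # mutual defection
}

def tit_for_tat(counterpart_history):
    """Cooperate first; afterwards, copy the counterpart's previous move."""
    return "C" if not counterpart_history else counterpart_history[-1]

def always_defect(counterpart_history):
    """An unconditionally competitive counterpart."""
    return "D"

def play(strategy1, strategy2, rounds=10):
    """Play repeated rounds; each strategy sees the other's past moves."""
    moves1, moves2 = [], []
    score1 = score2 = 0
    for _ in range(rounds):
        m1 = strategy1(moves2)  # each player sees the counterpart's history
        m2 = strategy2(moves1)
        moves1.append(m1)
        moves2.append(m2)
        p1, p2 = PAYOFFS[(m1, m2)]
        score1 += p1
        score2 += p2
    return score1, score2

# Tit-for-tat against itself sustains mutual cooperation over 10 rounds...
print(play(tit_for_tat, tit_for_tat))    # (30, 30)
# ...while against an unconditional defector it is exploited only once.
print(play(tit_for_tat, always_defect))  # (9, 14)
```

The appeal of tit-for-tat is exactly what the article describes: it rewards cooperation with cooperation and retaliates against exploitation, so cooperative partners do well with it while free riders gain little.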

The research supports the Next Generation Combat Vehicle Army Modernization Priority as well as the Army Priority Research Area for Autonomy. It aims to apply the study's insights to develop autonomous machines that promote cooperation with soldiers, so that hybrid human-machine teams can successfully accomplish missions.

According to de Melo, their research shows that emotion expressions can shape cooperation. For example, smiling at a comrade after mutual cooperation encourages more cooperation, but smiling after exploiting someone hinders cooperation within a group.

The effect of emotion expressions is moderated by strategy: people process and are influenced by emotion expressions only when the counterpart's actions are insufficient to reveal its intentions. For instance, if the counterpart acts very competitively, people ignore and mistrust its emotion displays.
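One way to picture this moderation effect is an agent that discounts emotion signals when the counterpart's actions already reveal its intent. The sketch below is hypothetical, not a model from the study; the function names and weighting scheme are illustrative assumptions.

```python
def cooperation_rate(actions):
    """Fraction of observed actions that were cooperative ("C")."""
    return sum(1 for a in actions if a == "C") / len(actions)

def signal_weight(actions):
    """Weight given to the emotion signal: highest when actions are
    ambiguous (rate near 0.5), zero when actions clearly reveal intent."""
    rate = cooperation_rate(actions)
    return 1.0 - abs(rate - 0.5) * 2.0

def inferred_intent(actions, smiled_after_cooperation):
    """Blend action-based evidence with the emotion signal.

    Returns a value in [0, 1]: the inferred likelihood that the
    counterpart intends to cooperate.
    """
    w = signal_weight(actions)
    action_evidence = cooperation_rate(actions)
    signal_evidence = 1.0 if smiled_after_cooperation else 0.0
    return (1 - w) * action_evidence + w * signal_evidence

# A consistently competitive counterpart: its smile is ignored entirely.
print(inferred_intent(["D", "D", "D", "D"], smiled_after_cooperation=True))
# An ambiguous counterpart: the smile shifts the inference toward cooperation.
print(inferred_intent(["C", "D", "C", "D"], smiled_after_cooperation=True))
```

The point of the sketch is the qualitative pattern the study reports: the same smile carries full weight when behavior is ambiguous and no weight when behavior already tells the whole story.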


The future of human-agent teams

The research provides new insight into how strategy and emotion expressions combine to shape cooperation, said de Melo. It has important practical implications for designing autonomous machines: with the right combination of actions and emotion expressions, a machine can maximize cooperation from soldiers.

These systems can display emotion in different ways, such as text, voice, and, in the case of robots, nonverbal behavior.

According to de Melo, they are very optimistic that the research will benefit future soldiers, as it sheds light on how cooperation is best produced and sustained. This insight is vital for developing socially intelligent robots capable of behaving and communicating nonverbally with soldiers.

The research also has great potential to enhance the future of human-agent teaming in the Army. For now, the next steps are to further understand the roles of nonverbal signals and strategy in encouraging cooperation, and to determine the best ways to apply the findings to various autonomous machines.
