NSF Spends $50K to Improve Social Gaze of Robots

March 4, 2014 - 7:32 PM

In this undated photo released by Honda Motor Co., the company's walking, talking Asimo robot shows off Honda's new robotic lawn mower, the Miimo. Honda finally has its first product for the home packed with its prized robotics technology: a sensor-equipped lawn mower. (AP Photo/Honda Motor Co.)

(CNSNews.com) - The National Science Foundation has awarded $50,000 to Texas A&M Engineering Experiment Station - the engineering research agency of the State of Texas and a member of the Texas A&M University System - to improve the social gaze of robots and develop more "'human-like' conversation."

"This proposal further develops a software package that autonomously generates head and eye movements, in a virtual assistant or social robot synchronized with real-time speech for 'human-like' conversation," the grant award abstract said.

"Existing solutions use preprogrammed head and eye movements for conversation, as current technology cannot synchronize head and eye gaze for real-time speech. The proposed technology can be used in interactive open-ended conversations and can adapt to gender and culture of the conversation partner," it added.

"The broader impact of the project include the validated benefits of the technology for end users, such as increased social acceptance, increased positive feelings, increased engagement, improved understandability, and superior likeability," the abstract said.

The proposed technology may be "transformative in a large number of markets like video games, online marketing, web customer service, telepresence, telemedicine, entertainment, eldercare, and healthcare."

It could also lead to "substantial cost savings to organizations that deploy virtual assistant or social robot solutions, and increased revenues for the vendors of these solutions due to accelerated consumer adoption."

"This project presents three contributions to human-computer and human-robot interaction. First, autonomous generation of head and eye gaze in a virtual assistant or social robot synchronized with real-time speech for open-ended interactive conversations," it said.

"Second, the front end of the social agent responsible for gesture generation is completely independent of the back end knowledge base; hence deployments of virtual assistant or social robot solutions are easier, cheaper, and faster. Third, the technology enables generation of social gaze sensitive to gender and culture," it added.

The grant, titled "I-Corps: Social Gaze for Software Agents and Robots," began on Feb. 1, 2014, and expires on July 31, 2014.

CNSNews.com contacted Robin Murphy, the grant's principal investigator at Texas A&M, about the grant and the technology, and contacted the NSF for additional information about the I-Corps program, but calls and emails were not returned by press time.