Machines are appearing in more and more application areas, where they help people make better decisions. International Data Corporation (IDC) anticipates global expenditures of more than 40 billion US dollars for cognitive solutions by the year 2020. Fraunhofer aims to pool its activities in the relevant research fields more effectively and to promote them systematically.
As a shopping assistant, the service robot Paul recently welcomed customers in an electronics store, asked which products they were looking for, and escorted them to the appropriate shelf. On the way, he chatted about the weather and then asked a few feedback questions, for example whether the customer was satisfied with his service. “He operates in a dynamic, everyday environment in which he recognizes objects or people and reacts to them,” explains Martin Hägele, head of the Robot and Assistive Systems Department at Fraunhofer IPA. The robot uses sensors to gather information about its surroundings so that it can navigate dependably at all times and locate the people it talks to.
Paul is based on the Care-O-bot® 4 robot platform, which Fraunhofer IPA originally developed to actively support people in households, hotels, nursing homes and hospitals. Now plans call for increased deployment in companies.
Understanding environments, planning actions, reacting to obstacles, communicating with people: cognitive systems master these challenges by harnessing machine learning methods. Here, machines learn to solve a task on the basis of example data and to transfer what they have learned to new situations. For example, they can plan and optimize processes, make forecasts, recognize patterns or distinctive features, and analyze image and voice signals. These systems form the basis for important future technologies such as autonomous driving and autonomous robots.
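The core idea of learning from example data and generalizing to new situations can be illustrated with a minimal sketch. The following k-nearest-neighbor classifier is not part of any Fraunhofer system; the product dimensions and shelf categories are invented for illustration only.

```python
import math

def knn_classify(examples, query, k=3):
    """Classify a query point by majority vote of its k nearest labeled examples."""
    # examples: list of (feature_vector, label) pairs
    by_distance = sorted(examples, key=lambda ex: math.dist(ex[0], query))
    votes = [label for _, label in by_distance[:k]]
    return max(set(votes), key=votes.count)

# Toy training data: (width_cm, height_cm) of packaged products -> shelf category
training = [
    ((5.0, 10.0), "phone"), ((6.0, 12.0), "phone"), ((5.5, 11.0), "phone"),
    ((30.0, 20.0), "laptop"), ((35.0, 24.0), "laptop"), ((32.0, 22.0), "laptop"),
]

# The learned decision rule transfers to an item it has never seen
print(knn_classify(training, (33.0, 21.0)))  # -> laptop
```

The same principle, scaled up to high-dimensional image and voice signals, underlies the pattern recognition and forecasting capabilities described above.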
Thanks to the SHORE software from Fraunhofer IIS and "affective computing", shopping assistant Paul can even recognize a person's mood and express his own state of mind. This involves machine recognition of emotions from facial expressions, sensor data fusion, and the analysis of biosignals such as pulse, voice, gestures or movement. For example, the stress level of car drivers or factory workers can be determined, as can customer wishes or requirements. “Analysts regard affective computing as the commercially fastest growing market in the machine learning field,” explains Dr. Jens-Uwe Garbas, head of the Intelligent Systems Group at Fraunhofer IIS.
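Sensor data fusion of this kind can be sketched very simply: several normalized biosignal features are combined into a single estimate. The feature ranges and weights below are purely illustrative assumptions and have nothing to do with how SHORE actually works.

```python
def stress_score(pulse_bpm, voice_pitch_var, movement_level, weights=(0.5, 0.3, 0.2)):
    """Fuse three biosignal features into a single stress estimate in [0, 1].

    Assumed (hypothetical) feature ranges: pulse 60-120 bpm,
    voice pitch variance 0-1, movement level 0-1.
    """
    # Normalize the pulse into [0, 1] and clamp out-of-range readings
    pulse_norm = min(max((pulse_bpm - 60) / 60, 0.0), 1.0)
    features = (pulse_norm, voice_pitch_var, movement_level)
    # Weighted sum: each sensor contributes according to its assumed reliability
    return sum(w * f for w, f in zip(weights, features))

# A calm reading versus an agitated one
print(round(stress_score(65, 0.1, 0.05), 2))   # low score
print(round(stress_score(110, 0.8, 0.7), 2))   # high score
```

Real systems replace the hand-picked weights with parameters learned from labeled example data, but the fusion step itself follows this pattern.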