Workflow of human activity recognition
In line with initiatives such as Industry 4.0 in Germany and Society 5.0 in Japan, the manufacturing industry is accelerating efforts to innovate production with AI and robotics and to automate menial tasks. At the same time, IoT technology is expected to collect and recognize the condition and movement of everything on the shop floor, including people and equipment, in order to assist operations and prevent human error. As a result, camera-based monitoring systems have in recent years been developed to detect inappropriate worker movements and to support predictive diagnosis of equipment failures on production lines.
Researchers from the DFKI research department Smart Data & Knowledge Services and Hitachi have developed AI for human activity recognition that recognizes workers' activities from data collected through wearable devices rather than from camera images. The main features of the developed AI are as follows:
1. Technology to recognize gazed objects using eye-tracking glasses
This technology recognizes the object a worker is looking at, such as a “screw” or a “screwdriver”, without being disturbed by the surrounding environment, for example the background or other objects. It extracts gaze points from the eye movements of workers wearing the eye-tracking glasses and applies deep-learning-based image recognition.
2. Technology to recognize basic human actions through an armband device
This technology recognizes basic human actions that involve arm movements, such as “twist” or “push”. It extracts data related to body actions from the minute, instantaneous signals measured by sensors attached to the arms.
3. “Hierarchical activity-recognition model” that recognizes workers’ activities by integrating gazed objects and human actions
This technology integrates the two technologies above into a “hierarchical activity-recognition model” that recognizes activities such as “twisting a screw”. As a result, a wide variety of work activities can be recognized, provided that all the actions and objects involved in those activities have been learned in advance (a simplified sketch of this pipeline follows below).
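The announcement does not disclose implementation details, but the overall pipeline can be illustrated with a minimal sketch. In the Python code below, object_model and action_model stand for hypothetical pretrained classifiers (an image model for the gazed object and a signal model for the arm action), and the patch size and the rule table mapping (action, object) pairs to activities are assumptions made purely for illustration, not the actual DFKI/Hitachi system.

import numpy as np

GAZE_CROP = 224  # assumed size (pixels) of the patch taken around the gaze point

# Assumed mapping from (basic action, gazed object) pairs to work activities.
ACTIVITY_RULES = {
    ("twist", "screw"): "twisting a screw",
    ("push", "switch"): "pressing a switch",
}


def recognize_gazed_object(frame, gaze_xy, object_model):
    """Technology 1: crop a patch around the gaze point reported by the
    eye-tracking glasses and classify it with a deep-learning image model.
    object_model is a hypothetical classifier returning labels like "screw"."""
    x, y = gaze_xy
    half = GAZE_CROP // 2
    patch = frame[max(0, y - half):y + half, max(0, x - half):x + half]
    return object_model.predict(patch)


def recognize_arm_action(sensor_window, action_model):
    """Technology 2: classify a short window of armband sensor signals.
    action_model is a hypothetical classifier returning labels like "twist"."""
    return action_model.predict(np.asarray(sensor_window))


def recognize_activity(frame, gaze_xy, sensor_window, object_model, action_model):
    """Technology 3: the hierarchical step, combining the recognized object
    and the recognized basic action into a higher-level activity label."""
    obj = recognize_gazed_object(frame, gaze_xy, object_model)
    act = recognize_arm_action(sensor_window, action_model)
    return ACTIVITY_RULES.get((act, obj), "unknown activity")

In this reading, the hierarchy lies in recognizing low-level elements (object and action) first and composing them into an activity afterwards, which is why new activities can be covered as long as their constituent actions and objects were learned in advance.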
Based on these technological developments, AI technology was realized that can recognize, in real time, activities such as “twisting a screw” or “pressing a switch” as part of an “inspection task”. Using this newly developed AI, DFKI and Hitachi will advance technology for assisting operations and preventing human error on the manufacturing front line, where operation guidance and detection of inadequate actions are required.
DFKI and Hitachi will exhibit part of this technology at “CeBIT 2017”, a leading global exhibition for digital business, to be held from 20 to 24 March 2017 in Hannover, Germany.