Coordinated by the Istituto Italiano di Tecnologia (IIT) in Italy, An.Dy (Advancing Anticipatory Behaviors in Dyadic Human-Robot Collaboration) is a Horizon 2020 European project set up to research a branch of collaborative robotics. The project focuses on the nuances of human-robot interaction, aiming to study the effect on ergonomic factors when a user needs to intervene and change a robot’s behavior. The research also includes the estimation of human motion dynamics and the collection of associated data sets, the modeling and control of human-robot physical collaboration, and the deployment of these technologies in real scenarios.
To find out more, we spoke with Daniele Pucci, Head of the Dynamic Interaction Control research line and Principal Investigator of An.Dy, about the project team’s use of inertial motion capture technology and how the analysis of human motion informs their research.
Effects of Motion
Within An.Dy, the perception of the human is divided into two levels: kinematics (the study of motion) and articular stress (the load placed on joints, bone, and cartilage). By using Xsens’ MVN Analyze motion capture system together with An.Dy’s algorithms, the team can estimate human kinematics.
“We have to perceive the human to prepare the robots. Together, both the human and robots work on shared tasks, such as carrying an object. The problem is working out the human behavior when circumstances change, and the movement adapts.”
“We study kinematics to accurately locate the positioning of the human, but we also need to understand the degree of articular stress experienced to assess ergonomic safety. At these perception levels, we use Xsens to retrieve the orientation of the limbs in the human, and then we apply our estimation algorithms to obtain human posture and articular stress,” said Daniele.
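The kinematics level Daniele describes starts from limb orientations and derives joint-level quantities such as posture. As a minimal sketch of that idea (this is not the An.Dy code; the segment orientations, rotation axis, and angles below are invented for illustration), the relative rotation between two adjacent body segments gives the joint angle:

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def joint_angle(R_parent, R_child):
    """Magnitude of the relative rotation between two adjacent segments,
    each given as a world-frame orientation matrix (e.g. from a
    motion capture suit)."""
    R_rel = R_parent.T @ R_child
    # Axis-angle magnitude from the trace of the relative rotation
    return np.arccos(np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0))

# Toy example: upper arm rotated 10 degrees, forearm rotated 100 degrees
# about the same axis, giving an elbow flexion of roughly 90 degrees
upper_arm = rot_z(np.deg2rad(10.0))
forearm = rot_z(np.deg2rad(100.0))
elbow_deg = np.rad2deg(joint_angle(upper_arm, forearm))
```

A full-body posture estimate repeats this for every pair of linked segments in the body model; the articular-stress estimation the team mentions then builds on these joint angles.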
While the key function of collaborative robotics is to assist the human with specific movements, the advancement of the technology is leading to situations which require room for adaptation. An intentional interaction is characterized as an intervention by the human that changes the robot’s behavior. Engineering robotics that can account for these events is complex due to their unpredictability—what does it mean to the robot if the human wants to stop the function, and how might this affect ergonomics?
“Let’s say a robot is trying to stand up from a chair and the human decides to help it; we consider this an intentional interaction. In our research we have two domains. The first is intentional interaction: how can a human help a robot? The second is ergonomics and human prediction: how can a robot help a human?” explained Daniele.
By researching human movement, An.Dy can provide data to engineers so that the robots are safer to use—the research also provides instructions on optimal human positioning. The freedom to analyze movement in real-world environments makes inertial motion capture an invaluable tool for assessment, especially when the data is applied in real working conditions.
“We’re trying to implement ergonomic controllers that can optimize the human postures during collaboration tasks. For instance, if a robot has to lift a box together with a human, we find out the optimal position of the box so that the articular stress of the human can be reduced.”
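The box-lifting example can be illustrated with a toy optimization. This is not the An.Dy controller: the cost model, elbow height, and penalty factor below are assumptions made purely to show the shape of the problem (search for the handover position that minimizes a proxy for articular stress):

```python
def articular_cost(box_height_m, elbow_height=1.05, stoop_penalty=2.0):
    """Toy articular-stress proxy (assumed, not the An.Dy model):
    squared deviation of the box from elbow height, with bending
    down penalized more heavily than reaching up."""
    d = box_height_m - elbow_height
    return stoop_penalty * d * d if d < 0 else d * d

# Grid-search candidate handover heights between 0.40 m and 1.70 m
heights = [h / 100 for h in range(40, 171)]
best_height = min(heights, key=articular_cost)
```

In the real project the cost would come from the estimated joint torques of the specific person wearing the suit, and the robot’s controller would adjust the shared load online rather than over a fixed grid.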
“We can conduct our research in environments specific to the task—this means our data will have greater accuracy. Since we were introduced to Xsens, we haven’t stopped using it.”
Researchers at An.Dy use analysis software to compile, process, and present their data. Using inertial motion capture to obtain the kinematics, Daniele’s team then utilizes their own middleware, YARP (Yet Another Robot Platform), together with ROS (Robot Operating System) for visualization.
“We use Xsens as the main platform for human perception; then we have our libraries for estimating human kinematics and articular stress. Finally, we use our own middleware, YARP, which integrates with ROS and allows us to visualize our results.”
“We’re looking to merge other technologies into the Xsens framework in the future, such as sensorized insoles.”
An.Dy has a range of research projects in development—one of which is an ambitious study that goes well beyond previous collaborative robotics research. Daniele aims to study a large group of people paired with robots in a working environment, examining each individual’s biomechanics simultaneously. This will require an unprecedented level of coordination as the researchers attempt to study the perceptions of each human and how they preempt or prompt each other, all while analyzing the robots.
“We want to extend our algorithms to the crowd level so that we can analyze multiple humans at the same time. At the moment, you can only have one human and one robot at any one time; the human-robot collaboration is one-to-one. We have envisioned an opportunity to study multiple humans with multiple robots, and humans wearing exoskeletons. All of them have to operate towards a common objective. The question is: how can the control and estimation of all of these people be handled?” said Daniele.
With innovation at the heart of An.Dy’s research, the project is helping to drive the safe and effective adoption of collaborative robotics across Europe, with inertial motion capture providing the data needed to conduct that research.
MVN Analyze software trial request
You can request a 15-day trial of the MVN Analyze software by filling out your details.
European Union's Horizon 2020
An.Dy has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No. 731540.