Monash tackles prediction gap stalling Industry 5.0

Published 26/03/2026 | Written by Heather Wright


Making robots workplace ready…

How do you ensure robots and humans can work safely together, side by side, with no separation required?

That’s the issue Monash University faced and is now looking to solve.

“That will lower the hurdle for industry to use these models.”

Yunlong Tang, assistant director of the Monash Centre for Additive Manufacturing and senior lecturer in mechanical and aerospace engineering and materials science and engineering, told iStart that despite a surge in robotics adoption, most Australian and New Zealand factories still separate humans and robots with physical cages. As workflows demand greater speed, flexibility and human involvement, he says, that approach is no longer the best option.

Tang is the co-author of a new international review on human-robot collaboration, exploring how manufacturers can make that collaboration safer, more adaptive and more efficient by improving the way robots predict human behaviour in shared industrial environments. It also identifies major system-level gaps, including the absence of standardised behavioural datasets and the need for models that can interpret human movement, intent and cognitive state.

As manufacturing moves toward Industry 5.0, production systems are becoming more human-centred, combining human creativity, judgement and dexterity with robotic precision, strength and speed. But that’s creating new safety and coordination challenges – if robots can’t accurately anticipate what a worker will do next, the risk of collisions, delays and inefficient collaboration increases.

For Monash, which is building a digital twin system enabling human and robot collaboration, the issue is more than theoretical. The university faced real safety concerns around ensuring people and robots can operate together without risk.

It was that concern that prompted the review, published in the International Journal of Production Research. It outlines three key approaches for predicting human behaviour during human-robot collaboration – mechanism-based models built around physical motion and interaction rules, data-driven models that learn from sensors and AI, and hybrid combinations of the two. It concludes that integrated models, combining physical-world understanding with AI-driven data insights, will be necessary for the safety, efficiency and generalisation required in modern manufacturing systems.
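The review itself contains no code, but the hybrid idea can be illustrated with a minimal sketch: a mechanism-based constant-velocity model extrapolates an operator's hand position, and a data-driven correction (here a trivial running average of past prediction errors, standing in for a learned model) adjusts the estimate. All names and the correction scheme are hypothetical, chosen only to show how the two model families can be combined.

```python
# Hypothetical hybrid predictor: physics-style extrapolation plus a
# data-driven residual correction. This is an illustrative sketch, not
# the method described in the Monash review.

def mechanism_predict(pos, vel, dt):
    """Mechanism-based step: next position under constant velocity."""
    return tuple(p + v * dt for p, v in zip(pos, vel))

class ResidualModel:
    """Stand-in for a learned model: averages recent prediction errors."""
    def __init__(self):
        self.errors = []

    def update(self, predicted, observed):
        # Record how far the mechanism-based prediction was off.
        self.errors.append(tuple(o - p for o, p in zip(observed, predicted)))

    def correct(self, predicted):
        # Shift the new prediction by the mean of past errors.
        if not self.errors:
            return predicted
        n = len(self.errors)
        mean_err = tuple(sum(e[i] for e in self.errors) / n
                         for i in range(len(predicted)))
        return tuple(p + e for p, e in zip(predicted, mean_err))

def hybrid_predict(pos, vel, dt, residual_model):
    """Hybrid step: mechanism-based prediction, then learned correction."""
    return residual_model.correct(mechanism_predict(pos, vel, dt))
```

In a real system the residual model would be a trained network consuming multimodal sensor data, but the structure – physical prior plus learned correction – is the same shape of hybrid the review points toward.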

Tang says the real barrier to Industry 5.0 is that robots still can’t reliably anticipate what humans will do next, and industry won’t change how it deploys robots until that prediction gap – not issues around power or speed – is closed.

Traditional solutions use rule-based boundaries to keep robots and humans separated. “We separate the robot and human in different spaces, and of course the human is much safer and feels more confident working with the robot,” Tang says.

But that approach limits efficiency: for many tasks, having robot and human share the same space makes the work easier and faster.

Removing those boundaries requires robots to handle unpredictable human behaviour, and Tang says industry demands absolute reliability before attempting this shift. The report says Industry 5.0 environments introduce new safety risks if robots cannot accurately anticipate the next human action.

“The conclusion we got from this research is we didn’t see any existing method or model that can solve all the problems. Every approach has its own shortcomings.”

The review suggests future progress will depend on combining physical models, sensor data and AI in ways that allow robots to respond more intelligently to human movement, intent and changing working conditions.

Key challenges, however, include the absence of standardised multimodal datasets, the limited scope of physical world models, the variability of human behaviour and the need to more effectively consider human trust, workload and cognitive state during collaboration.

Closing the gap between research and deployment

Tang and his team at Monash have now embarked on a multi-year project to address the issue with the ultimate aim of releasing open-source base models and frameworks for industry to use.

“Our plan is that in the first one or two years, we get enough data. Then we’ll be training the model and validating the models and seeing whether it can help humans and robots have the better collaboration,” he says.

In Monash’s Living Lab, robot 3D printers and humans are working together. “We have a lot of sensors to detect the human actions and we want to build a dataset to understand, for example, that a human is now picking up items from the 3D printers, and then link that to the next potential behaviour that the operator will do.”

The lab captures natural human motion, decision sequences and task patterns, giving researchers the multimodal data required to train predictive models.
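One simple way to turn such decision sequences into a next-action predictor is a first-order transition model: count how often each observed action is followed by each other action, then predict the most frequent successor. The class and the action labels below are hypothetical illustrations, not Monash's actual pipeline, which would use far richer multimodal data.

```python
from collections import Counter, defaultdict

class NextActionModel:
    """Illustrative first-order model: counts action transitions observed
    in the lab and predicts the most likely next operator action."""

    def __init__(self):
        # transitions["pick"]["inspect"] = number of times "pick"
        # was immediately followed by "inspect".
        self.transitions = defaultdict(Counter)

    def observe(self, sequence):
        """Record every adjacent (current, next) pair in a task sequence."""
        for current, nxt in zip(sequence, sequence[1:]):
            self.transitions[current][nxt] += 1

    def predict(self, current):
        """Return the most frequently observed successor, or None."""
        counts = self.transitions.get(current)
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

A robot controller could use such a prediction to pre-position itself, or to keep clear of the operator's likely next reach; real systems would add probabilities, timing and sensor context rather than a single hard prediction.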

The resulting open-source base models and frameworks would be made available for industry to adopt, with companies taking the algorithms and framework and adding their own industry-specific data.

“I hope in future, if this approach has been proved valid and we can build an open database, the more data we have, the more accurate, and much smarter robots can work with humans. That will lower the hurdle for industry to use these models.”

He says collecting data will be one headache for industry, with companies needing to spend one or two years – “sometimes even longer” – to gather enough data to train the models.

“I hope that in the future, like our large language model nowadays, we can train some foundation model and based on the foundation model, fine tune for the different industries. Maybe that will be the next step.”
