Pushing robots beyond traditional “safe but slow” parameters requires sensing and perception capabilities that can adapt in real time. Where cobots once leaned heavily on restricted force outputs and mechanical compliance, the next generation must operate at higher speeds without compromising human safety. This leap calls for sensor systems with wider coverage, faster update rates, and tighter integration with robust computing platforms. By continuously tracking the proximity of objects and humans, predicting motion trajectories, and distinguishing humans from objects and the static environment, these advanced systems can prevent hazardous collisions in dynamic, shared environments.
Humanoid robots add another dimension to this challenge. Their anthropomorphic design allows them to function in spaces built for humans, but it also places them in unpredictable surroundings, from busy factory floors to everyday public settings. Achieving safe operation in these complex environments requires not just more sensors, but also more intelligent sensor fusion. When this fusion is coupled with deterministic algorithms that process data from multiple spectra, robots gain reliable situational awareness, even when vision is obscured by fog, bright light, or occlusion. Additionally, flexible “skin-like” sensors ensure large-area coverage without adding excessive weight, enabling robots to move at speeds closer to their mechanical potential.
In our review paper published in IEEE Sensors Journal, we summarize key sensor and computing imperatives that will power the next wave of swift yet safe robots, whether wheeled, industrial, or humanoid, as we approach Industry 5.0:
- Higher frame rates (90 Hz or more) in improved proximity sensors provide the real-time responsiveness needed to keep fast human–robot interactions safe and collision-free.
- Dynamic, parallel computing is needed to process high-volume, low-latency data streams across extensive sensor arrays. Integrating adaptive sensing mechanisms allows frame rates to be raised in critical zones, such as human–robot handover areas (a minimal sketch of this zone-based adaptation follows this list).
- Flexible, skin-like sensors deliver high-resolution, lightweight coverage of large surface areas without compromising the robot’s payload capacity.
- Fusing signals from different sensors enables smarter detection and differentiation of humans and objects, and increases robustness and fault tolerance under challenging conditions such as fog or occlusion (see the fusion sketch after this list).
- Deterministic algorithms operating across multiple sensors in various spectra help interpret spatial information and build comprehensive awareness of the robot’s surroundings, more reliably than algorithms based solely on large annotated datasets. This approach not only enables a clear distinction between static and dynamic objects and their types, but also supports effective tracking within the environment (a static/dynamic classification sketch follows this list). The massive amounts of sensor data collected on the factory floor should be fed into new AI-based foundation models for more sophisticated spatial interpretation.
- Standards offer a regulatory framework and standardized safety indices, both essential for advancing human–robot collaboration solutions.
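
To make the adaptive-sensing idea concrete, here is a minimal Python sketch of zone-based frame-rate selection. All names, thresholds, and rates are illustrative assumptions rather than values from the paper; a real system would derive them from the sensor’s latency budget and the applicable safety standard.

```python
# Minimal sketch of zone-based adaptive frame rates (all names and
# thresholds are illustrative assumptions, not values from the paper).

def select_frame_rate_hz(min_distance_m: float,
                         in_handover_zone: bool) -> int:
    """Pick a proximity-sensor update rate from the separation distance.

    Closer humans (or a designated handover zone) demand faster updates
    so the safety system can react within its latency budget.
    """
    if in_handover_zone or min_distance_m < 0.5:
        return 120   # critical zone: highest rate the sensor supports
    if min_distance_m < 1.5:
        return 90    # collaboration zone: >= 90 Hz for fast interaction
    return 30        # far field: save bandwidth and compute

# Example: a human hand entering the handover area triggers the top rate.
print(select_frame_rate_hz(min_distance_m=0.4, in_handover_zone=True))  # 120
```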
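
Similarly, the fusion imperative can be illustrated with a confidence-weighted blend of two hypothetical modalities. The modality names, weights, and probabilities below are assumptions chosen for illustration; production systems fuse calibrated, time-synchronized streams with far richer models.

```python
# Minimal sketch of confidence-weighted fusion of two sensing modalities
# (hypothetical modality names and weights; real systems fuse calibrated,
# time-synchronized streams).

def fuse_human_probability(camera_p: float, radar_p: float,
                           camera_visibility: float) -> float:
    """Blend per-modality human-detection probabilities.

    camera_visibility in [0, 1] models degraded vision (fog, glare,
    occlusion): as it drops, the radar estimate dominates, which is
    what gives the fused system its fault tolerance.
    """
    w_cam = camera_visibility
    w_rad = 1.0
    return (w_cam * camera_p + w_rad * radar_p) / (w_cam + w_rad)

# In fog the camera barely sees the person, but radar still does,
# so the fused probability stays high (about 0.84 here).
print(fuse_human_probability(camera_p=0.2, radar_p=0.9, camera_visibility=0.1))
```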
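
Finally, a deterministic static/dynamic split can be as simple as thresholding frame-to-frame motion on a range grid. The grid representation, frame interval, and threshold below are illustrative assumptions; the point is that the rule is explicit and auditable rather than learned from an annotated dataset.

```python
# Minimal sketch of a deterministic static/dynamic split on a range grid
# (illustrative thresholds; a real system would work on calibrated 3D data).

def classify_cells(prev_range_m: list[float], curr_range_m: list[float],
                   dt_s: float, speed_threshold_m_s: float = 0.1) -> list[str]:
    """Label each grid cell 'static' or 'dynamic' from frame-to-frame motion.

    A cell whose measured range changes faster than the threshold is
    treated as a dynamic object (e.g. a moving human) and handed to the
    tracker; everything else counts as static workspace.
    """
    labels = []
    for prev, curr in zip(prev_range_m, curr_range_m):
        speed = abs(curr - prev) / dt_s
        labels.append("dynamic" if speed > speed_threshold_m_s else "static")
    return labels

# Two frames 1/90 s apart: only the middle cell moved noticeably.
print(classify_cells([2.00, 1.50, 3.00], [2.00, 1.48, 3.00], dt_s=1 / 90))
# -> ['static', 'dynamic', 'static']
```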
imec and Vrije Universiteit Brussel (VUB) are jointly developing an advanced safety perception system that enables robots to operate at higher speeds without compromising human safety. By combining real-time sensor fusion, target-driven perception, and predictive motion analysis, this system enhances situational awareness, ensuring safe and efficient human–robot collaboration in dynamic environments.

Constantin Scholz is a roboticist at imec and Vrije Universiteit Brussel (VUB) and leads SAFEBOT, a venture in the spin-out phase developing safety perception technologies for physical AI. His mission is to enable fast robots in any form factor, without cages or slow cycle times, by combining multi-spectrum perception, real-time sensor fusion, and embedded safety intelligence.
Published on:
12 May 2025