Fusion of Time-of-Flight Based Sensors with Monocular Cameras for a Robotic Person Follower
Journal of Intelligent & Robotic Systems (IF 3.3) Pub Date: 2024-02-03, DOI: 10.1007/s10846-023-02037-4
José Sarmento, Filipe Neves dos Santos, André Silva Aguiar, Vítor Filipe, António Valente

Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industry and agriculture. This type of collaboration can increase productivity by reducing physical strain on humans, which in turn can lead to fewer injuries and improved morale. One crucial aspect of HRC is the robot's ability to follow a specific human operator safely. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the position of a human target relative to the robot. UWB transceivers can track a human carrying a transceiver, but they exhibit a significant angular error. To reduce this error, monocular cameras with Deep Learning object detection are used to detect humans. The angular error is reduced through sensor fusion: the outputs of both sensors are combined by a histogram-based filter, which projects and intersects the measurements from both sources on a 2D grid. By combining UWB and monocular vision, a 66.67% reduction in angular error compared to UWB localization alone is achieved. The approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.
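The abstract does not give implementation details, but the core idea — intersecting a range-accurate, bearing-noisy UWB measurement with a bearing-accurate camera detection on a 2D grid — can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `fuse_uwb_camera`, the Gaussian measurement models, and all noise parameters are assumptions chosen for the example.

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Unnormalised Gaussian likelihood."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def fuse_uwb_camera(uwb_range, uwb_bearing, cam_bearing,
                    sigma_range=0.1, sigma_uwb_ang=0.35, sigma_cam_ang=0.05,
                    grid_size=200, extent=5.0):
    """Histogram-filter-style fusion on a 2D grid (illustrative sketch).

    Each grid cell gets a likelihood from the UWB measurement (tight in
    range, loose in bearing) multiplied by the camera measurement
    (bearing only, tight). The cell with the highest combined
    likelihood is returned as the fused (x, y) estimate.
    """
    xs = np.linspace(-extent, extent, grid_size)   # lateral axis [m]
    ys = np.linspace(0.0, extent, grid_size)       # forward axis [m]
    X, Y = np.meshgrid(xs, ys)
    R = np.hypot(X, Y)                             # range of each cell
    TH = np.arctan2(X, Y)                          # bearing of each cell (0 = straight ahead)

    # UWB likelihood: accurate range, large angular error
    p_uwb = gaussian(R, uwb_range, sigma_range) * gaussian(TH, uwb_bearing, sigma_uwb_ang)
    # Camera likelihood: bearing only, low angular error
    p_cam = gaussian(TH, cam_bearing, sigma_cam_ang)

    p = p_uwb * p_cam                              # intersect the two measurements
    i, j = np.unravel_index(np.argmax(p), p.shape)
    return X[i, j], Y[i, j]
```

With a UWB reading of 2.0 m at 0.3 rad and a camera detection at 0.0 rad, the fused estimate keeps the UWB range but is pulled almost entirely onto the camera bearing, mirroring the angular-error reduction described in the abstract.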




Updated: 2024-02-03