TY - GEN
T1 - Vision-based articulated machine pose estimation for excavation monitoring and guidance
AU - Feng, C.
AU - Dong, S.
AU - Lundeen, K. M.
AU - Xiao, Y.
AU - Kamat, V. R.
PY - 2015
Y1 - 2015
N2 - The pose of an articulated machine includes the position and orientation of not only the machine base (e.g., tracks or wheels), but also each of its major articulated components (e.g., stick and bucket). The ability to automatically estimate this pose is a crucial component of technical innovations aimed at improving both safety and productivity in many construction tasks. A computer-vision-based solution using a network of cameras and markers is proposed in this research to enable such a capability for articulated machines. First, a planar marker is magnetically mounted on the end-effector of interest. Another marker, whose 3D pose has been pre-surveyed in a project coordinate frame, is fixed on the jobsite. A cluster of at least two cameras, each simultaneously observing and tracking one of the two markers, then forms a camera-marker network that transfers the end-effector's pose into the desired project frame, based on a pre-calibration of the relative poses between each pair of cameras. Through extensive uncertainty analyses and field experiments, this approach is shown to achieve centimeter-level depth tracking accuracy at ranges of up to 15 meters with only two ordinary cameras (1.1 megapixels each) and a few markers, providing a flexible and cost-efficient alternative to commercial products that rely on infrastructure-dependent sensors such as GPS. A working prototype has been tested on several active construction sites, with positive feedback from excavator operators confirming the solution's effectiveness.
KW - Bundle adjustment
KW - Camera-marker network
KW - Excavation guidance
KW - Pose estimation
KW - Uncertainty analysis
UR - http://www.scopus.com/inward/record.url?scp=85088743530&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85088743530&partnerID=8YFLogxK
U2 - 10.22260/isarc2015/0029
DO - 10.22260/isarc2015/0029
M3 - Conference contribution
AN - SCOPUS:85088743530
T3 - 32nd International Symposium on Automation and Robotics in Construction and Mining: Connected to the Future, Proceedings
BT - 32nd International Symposium on Automation and Robotics in Construction and Mining
PB - International Association for Automation and Robotics in Construction (IAARC)
T2 - 32nd International Symposium on Automation and Robotics in Construction and Mining: Connected to the Future, ISARC 2015
Y2 - 15 June 2015 through 18 June 2015
ER -