TY - GEN
T1 - CITR
T2 - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
AU - So, Peter
AU - Cabral Muchacho, Rafael I.
AU - Kirschner, Robin Jeanne
AU - Swikir, Abdalla
AU - Figueredo, Luis
AU - Abu-Dakka, Fares J.
AU - Haddadin, Sami
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The basis for robotics skill learning is an adequate representation of manipulation tasks based on their physical properties. As manipulation tasks are inherently invariant to the choice of reference frame, an ideal task representation would also exhibit this property. Nevertheless, most robotic learning approaches use unprocessed, coordinate-dependent robot state data for learning new skills, thus inducing challenges regarding the interpretability and transferability of the learned models. In this paper, we propose a transformation from spatial measurements to a coordinate-invariant feature space, based on the pairwise inner product of the input measurements. We describe and mathematically deduce the concept, establish the task fingerprints as an intuitive image-based representation, experimentally collect task fingerprints, and demonstrate the usage of the representation for task classification. This representation motivates further research on data-efficient and transferable learning methods for online manipulation task classification and task-level perception.
AB - The basis for robotics skill learning is an adequate representation of manipulation tasks based on their physical properties. As manipulation tasks are inherently invariant to the choice of reference frame, an ideal task representation would also exhibit this property. Nevertheless, most robotic learning approaches use unprocessed, coordinate-dependent robot state data for learning new skills, thus inducing challenges regarding the interpretability and transferability of the learned models. In this paper, we propose a transformation from spatial measurements to a coordinate-invariant feature space, based on the pairwise inner product of the input measurements. We describe and mathematically deduce the concept, establish the task fingerprints as an intuitive image-based representation, experimentally collect task fingerprints, and demonstrate the usage of the representation for task classification. This representation motivates further research on data-efficient and transferable learning methods for online manipulation task classification and task-level perception.
UR - http://www.scopus.com/inward/record.url?scp=85202438184&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85202438184&partnerID=8YFLogxK
U2 - 10.1109/ICRA57147.2024.10611312
DO - 10.1109/ICRA57147.2024.10611312
M3 - Conference contribution
AN - SCOPUS:85202438184
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 17501
EP - 17507
BT - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 13 May 2024 through 17 May 2024
ER -