In this paper, we address the problem of automatic engagement detection in multi-party Human-Robot Interaction scenarios. The aim is to investigate to what extent we can infer the engagement of one entity of a group based solely on the cues of the other entities present in the interaction. In a scenario featuring three entities, two participants and a robot, we extract behavioural cues for each entity; we then build models based solely on each entity's cues, and on combinations of them, to predict the engagement level of each participant. Person-level cross-validation shows that we can detect the engagement of the participant in question using solely the behavioural cues of the robot, with an accuracy comparable to that obtained using the participant's own cues (75.91% vs. 74.32%). Moreover, the behavioural cues of the other participant are also informative, permitting detection of the engagement of the participant in question with an average accuracy of 62.15%. The correlation between the features of the other participant and the engagement labels of the participant in question suggests a high cohesion between the two participants. In addition, the similarity of the most significantly correlated features across the two participants suggests a high synchrony between the two parties.
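The person-level cross-validation mentioned above can be sketched as a leave-one-person-out protocol: all frames of one participant are held out for testing while the model trains on the remaining participants. The snippet below is a minimal illustration, assuming scikit-learn; the synthetic data, the logistic-regression classifier, and the group layout are illustrative stand-ins, not the paper's actual features or model.

```python
# Sketch of person-level (leave-one-person-out) cross-validation.
# Assumptions: scikit-learn is available; data, classifier, and group
# sizes are synthetic placeholders, not the paper's setup.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_persons, frames_per_person, n_features = 6, 50, 4

# One row per observation frame; `groups` ties each frame to its person,
# so every test fold holds out all frames of one unseen person.
X = rng.normal(size=(n_persons * frames_per_person, n_features))
y = rng.integers(0, 2, size=n_persons * frames_per_person)  # engaged / not
groups = np.repeat(np.arange(n_persons), frames_per_person)

scores = cross_val_score(LogisticRegression(), X, y,
                         groups=groups, cv=LeaveOneGroupOut())
print(f"mean accuracy over {len(scores)} held-out persons: {scores.mean():.2%}")
```

Holding out entire persons, rather than shuffling frames, prevents frames of the same participant from appearing in both train and test sets, which would otherwise inflate the reported accuracies.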