In robot teleoperation scenarios, the interface between the user and the robot is of critical importance. In this paper, electromyographic (EMG) signals from muscles of the human upper limb are used as the control interface between the user and a remote robot arm. The proposed interface consists of surface EMG electrodes placed on the user's skin at several locations on the arm, leaving the upper limb free of the bulky sensors or machinery usually found in conventional teleoperation systems. Motion of the human upper limb involves the activation of a large number of muscles (more than 30, not counting those driving finger movements), while the human arm itself has 7 degrees of freedom (DoFs), allowing a wide variety of motions. Mapping between these two high-dimensional signals (the muscle activations and the motion of the human arm) is therefore extremely challenging. For this reason, a novel methodology is proposed here, in which the mapping between muscle activation and arm motion is performed in a low-dimensional space. The high-dimensional input (muscle activation) and output (arm motion) vectors are each transformed into its own low-dimensional space, where a mapping between the two low-dimensional representations becomes feasible. A state-space model is trained to map the low-dimensional representation of muscle activation to the corresponding motion of the user's arm. After training, the state-space model decodes human arm motion in real time with high accuracy, using only EMG recordings. The estimated motion is used to control a remote anthropomorphic robot arm. The accuracy of the proposed method is assessed through real-time experiments involving motion in two-dimensional (2D) space.
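The pipeline described above (high-dimensional EMG and arm motion each reduced to a low-dimensional space, with a state-space model mapping one latent representation to the other) can be sketched as follows. This is a minimal illustration under assumed choices, not the paper's implementation: it uses PCA for the dimensionality reduction and a linear state-space model fit by least squares; the channel count (11 EMG channels), latent dimension (2), and the synthetic data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 11 EMG channels, 7 arm DoFs, 2-D latent spaces.
T, n_emg, n_dof, k = 500, 11, 7, 2

# Synthetic stand-ins for recorded data: EMG envelopes and joint angles,
# both driven by a shared low-dimensional process plus noise.
latent = np.cumsum(rng.normal(size=(T, k)), axis=0)
emg = latent @ rng.normal(size=(k, n_emg)) + 0.1 * rng.normal(size=(T, n_emg))
arm = latent @ rng.normal(size=(k, n_dof)) + 0.1 * rng.normal(size=(T, n_dof))

def pca_project(X, k):
    """Project X onto its top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]

z_emg, W_emg = pca_project(emg, k)   # low-dim muscle activation
z_arm, W_arm = pca_project(arm, k)   # low-dim arm motion

# Fit a simple linear state-space map in the latent space:
#   x[t] = A x[t-1] + B u[t]   (x: arm latent state, u: EMG latent input)
X_prev, U, X_next = z_arm[:-1], z_emg[1:], z_arm[1:]
AB, *_ = np.linalg.lstsq(np.hstack([X_prev, U]), X_next, rcond=None)
A, B = AB[:k].T, AB[k:].T

# Decode: roll the model forward from EMG alone, then lift the latent
# trajectory back to joint space.
x_hat = np.zeros((T, k))
for t in range(1, T):
    x_hat[t] = A @ x_hat[t - 1] + B @ z_emg[t]
arm_hat = x_hat @ W_arm + arm.mean(axis=0)
print(arm_hat.shape)  # (500, 7) -- decoded joint trajectory
```

The key point the sketch makes concrete is that the regression is fit between two k-dimensional latent vectors rather than between the raw 11-dimensional EMG and 7-DoF motion signals, which is what makes the mapping tractable.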