Legged motion control

Dynamically stable gaits for quadruped walking vehicles

For dynamically stable walking of a quadruped walking robot, the combination of trajectory planning of body and leg positions (feedforward control) and adaptive control using sensory information (feedback control) is indispensable. To this end, we propose a new control technique for a stable trot gait, named the 3D sway compensation trajectory. This method utilizes rotational and translational body motion along the diagonal line between the supporting feet, together with vertical swing motion of the recovering legs.
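
The following is a minimal, illustrative sketch of a sway-compensated trot cycle: the body sways so that its projection stays near the diagonal support line while the recovering legs follow a half-sine vertical swing. The sinusoidal form, amplitudes, and parameter names are assumptions for illustration, not the published 3D sway compensation formulation.

    # Illustrative sketch of a sway-compensated trot cycle (assumed, simplified model).
    import math

    def body_sway(phase, sway_amp=0.03, lift_amp=0.01):
        """Body offset (lateral, vertical) at gait phase in [0, 1).

        Phase [0, 0.5): one diagonal leg pair supports -> body sways toward that
        diagonal; phase [0.5, 1): the other pair supports -> sway reverses.
        """
        lateral = sway_amp * math.sin(2.0 * math.pi * phase)        # side-to-side sway
        vertical = lift_amp * abs(math.sin(2.0 * math.pi * phase))  # 3D: body height modulation
        return lateral, vertical

    def swing_foot_height(swing_phase, step_height=0.05):
        """Half-sine vertical swing of a recovering leg over its swing phase in [0, 1)."""
        return step_height * math.sin(math.pi * swing_phase)

    if __name__ == "__main__":
        for i in range(5):
            phase = i / 5.0
            lat, vert = body_sway(phase)
            swing_phase = (phase % 0.5) * 2.0    # each diagonal pair swings for half the cycle
            print(f"phase={phase:.1f}  body lateral={lat:+.3f} m  body lift={vert:.3f} m  "
                  f"swing foot z={swing_foot_height(swing_phase):.3f} m")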

Vertical leg motion · Stabilization of unsymmetrical trot gait · Intermittent trot gait

Papers

Study on energy efficiency of quadruped walking vehicles

Though a legged robot has higher terrain adaptability than a wheeled vehicle, its moving speed is generally much lower. To attain high moving speed with a legged robot, dynamically stable gaits, such as running for a biped robot and the trot or bound gait for a quadruped robot, are a promising solution. However, the energy efficiency of a dynamically stable gait is generally lower than that of a statically stable gait such as the crawl gait. We are conducting an experimental study on the energy efficiency of a quadruped walking vehicle. The energy consumption of two walking patterns for the trot gait is investigated through experiments using a quadruped walking vehicle named TITAN-VIII. The obtained results show that the proposed 3D sway compensation trajectory has an advantage in energy efficiency over the original sway compensation trajectory.
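
As a hedged illustration of how such a comparison can be made (the page does not state the exact metric used in the TITAN-VIII experiments), the sketch below integrates joint mechanical power over a logged trajectory and reports the dimensionless specific resistance E/(m·g·d), a common cost-of-transport measure; the function names and the toy data are assumptions.

    # Assumed energy-efficiency comparison: integrate |torque * velocity| and
    # normalize by weight and travelled distance (specific resistance).
    def consumed_energy(torques, velocities, dt):
        """torques, velocities: per-sample lists of per-joint values; dt: sample period [s]."""
        energy = 0.0
        for tau_t, omega_t in zip(torques, velocities):
            energy += sum(abs(tau * omega) for tau, omega in zip(tau_t, omega_t)) * dt
        return energy  # joules

    def specific_resistance(energy, mass, distance, g=9.81):
        """Dimensionless cost of transport: lower means more efficient locomotion."""
        return energy / (mass * g * distance)

    if __name__ == "__main__":
        # toy log: 2 joints, constant torque/velocity for 1 s at 100 Hz
        T, dt = 100, 0.01
        torques = [[5.0, 3.0]] * T
        velocities = [[1.2, 0.8]] * T
        E = consumed_energy(torques, velocities, dt)
        print("energy [J]:", E,
              " specific resistance:", specific_resistance(E, mass=25.0, distance=0.5))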

Papers

Stable gait control for a biped robot

The sway compensation trajectory for dynamically stable walking, originally proposed for a quadruped robot, is extended to a biped robot. This method makes it quite easy to design stable ZMP (zero moment point) and COG (center of gravity) trajectories, a design problem that has been regarded as very complicated and delicate. The effectiveness of the proposed method is verified through computer simulations and walking experiments with the humanoid robots HOAP-1 and HOAP-2.
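
To make the ZMP-COG coupling concrete, the sketch below uses the standard cart-table relation p = x - (z_c / g) * x_ddot, which ties the ZMP to the COG position and acceleration; it only illustrates why the two trajectories must be designed together, and does not reproduce the sway compensation formulation itself (the COG height and toy sway are assumptions).

    # Standard cart-table ZMP model: p = x - (z_c / g) * x_ddot.
    import math

    G = 9.81  # gravity [m/s^2]

    def zmp_from_cog(x_prev, x, x_next, z_c, dt):
        """ZMP under the cart-table model, with COG acceleration from finite differences."""
        x_ddot = (x_next - 2.0 * x + x_prev) / (dt * dt)
        return x - (z_c / G) * x_ddot

    if __name__ == "__main__":
        # toy lateral COG sway at 1 Hz with 3 cm amplitude, COG height 0.28 m (small-humanoid scale)
        dt, z_c = 0.01, 0.28
        cog = [0.03 * math.sin(2.0 * math.pi * t * dt) for t in range(300)]
        zmp = [zmp_from_cog(cog[i - 1], cog[i], cog[i + 1], z_c, dt) for i in range(1, 299)]
        print("max |ZMP| =", max(abs(p) for p in zmp), "m  (must stay inside the support polygon)")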

Walking experiment · Dance by HOAP · Omni-directional walking motion
Step climbing · Dance step by fusing dynamically and statically stable walking

Papers

Straight legged walking for a biped robot

When a human walks, the knee joints are periodically stretched. This motion is, however, very difficult for a biped robot due to the lack of degrees of freedom (DOFs). We propose a new methodology for generating a straight legged walking pattern for a biped robot by utilizing up-and-down motion of the upper body. First, we define two new indexes, the Knee Stretch Index (KSI) and the Knee Torque Index (KTI), which indicate how efficiently the knee joints are utilized. Next, the up-and-down motion of the upper body is planned automatically so that these indexes are optimized and straight legged walking is realized. The basic idea of the proposed method is: i) when a large number of DOFs is required for controlling the ZMP, the robot lowers its body height; ii) when there are extra DOFs, the body is lifted and the knee joints are stretched. By stretching the knee joints, human-like natural walking motion is obtained. Moreover, energy efficiency is improved since the torque and energy consumption required at the knee joints to support the body weight become small.
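
The sketch below illustrates only the basic idea i)/ii), not the published KSI/KTI definitions or optimization: lowering the body necessarily bends the knee (via two-link leg kinematics), so a toy planner keeps the body as high as the assumed, normalized ZMP-control demand allows. All lengths, limits, and the demand model are illustrative assumptions.

    # Illustrative body-height rule for straight legged walking (assumed, simplified).
    import math

    THIGH, SHANK = 0.30, 0.30   # assumed leg segment lengths [m]

    def knee_angle(hip_height, foot_offset):
        """Knee flexion [rad] of a 2-link leg (0 = fully stretched), from the cosine law."""
        reach = min(math.hypot(hip_height, foot_offset), THIGH + SHANK - 1e-9)
        cos_inner = (THIGH**2 + SHANK**2 - reach**2) / (2.0 * THIGH * SHANK)
        return math.pi - math.acos(max(-1.0, min(1.0, cos_inner)))

    def plan_body_height(zmp_demand, h_max=0.595, h_min=0.50):
        """Toy version of rule i)/ii): lower the body (bend the knees) only when the
        ZMP controller needs more joint motion; zmp_demand is normalized to [0, 1]."""
        return h_max - (h_max - h_min) * max(0.0, min(1.0, zmp_demand))

    if __name__ == "__main__":
        for demand in (0.0, 0.5, 1.0):
            h = plan_body_height(demand)
            print(f"ZMP demand={demand:.1f}  body height={h:.3f} m  "
                  f"knee bend={math.degrees(knee_angle(h, foot_offset=0.05)):.1f} deg")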

Straight legged walking

Papers

Proactive Human Interface based on Embodied Agents

We are conducting research on an "Embodied Proactive Human Interface". The aim of this research is to develop a new human-friendly active interface based on two key technologies: an estimation mechanism of human intention for supporting natural communication, named the "Proactive Interface", and a tangible device built with robot technology. We have developed a humanoid-type two-legged robot named "PICO-2" as a tangible telecommunication device for the proactive human interface. In order to achieve embodied telecommunication with PICO-2, we propose a new technique for tracking human gestures with a monocular video camera mounted on PICO-2, and natural gesture reproduction by PICO-2 that absorbs the difference in body structure between the user and the robot. Remote communication experiments were also carried out using two PICO-2 robots placed at distant campuses.
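
As a hedged illustration of one ingredient of such gesture reproduction, the sketch below maps tracked human joint angles onto a robot whose joint limits differ from the human's by clamping in angle space. The actual PICO-2 tracking and reproduction methods are described in the papers below; the joint names, limits, and the angle-space mapping here are illustrative assumptions.

    # Assumed angle-space retargeting of tracked human gestures onto robot joints.
    import math

    # hypothetical small-humanoid arm joint limits [rad]
    ROBOT_LIMITS = {
        "shoulder_pitch": (-math.pi / 2, math.pi / 2),
        "shoulder_roll":  (0.0, math.pi / 2),
        "elbow_pitch":    (-2.0, 0.0),
    }

    def retarget(human_angles, limits=ROBOT_LIMITS):
        """Clamp each tracked human joint angle into the corresponding robot joint range,
        so gestures remain recognizable even though the body structures differ."""
        return {joint: max(lo, min(hi, angle))
                for joint, angle in human_angles.items()
                for lo, hi in [limits[joint]]}

    if __name__ == "__main__":
        # toy frame of tracked angles (e.g. from a monocular pose tracker)
        frame = {"shoulder_pitch": 1.9, "shoulder_roll": 0.4, "elbow_pitch": -2.4}
        print(retarget(frame))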

Photos: PICO, PICO-2, PICO-2K
Mimic gestures · ZMP balance control · Omnidirectional motion
Motion tracking by monocular camera · Remote communication experiments

Papers

  • Seiichi Uchida, Akihiro Mori, Ryo Kurazume, Rin-ichiro Taniguchi and Tsutomu Hasegawa, Logical DP Matching for Detecting Similar Subsequence, Proc. of the 8th Asian Conference on Computer Vision, November 2007.
  • Masato Nakajima, Seiichi Uchida, Akihiro Mori, Ryo Kurazume, Rin-ichiro Taniguchi, Tsutomu Hasegawa and Hiroaki Sakoe, Motion Prediction Based on Eigen-Gestures, Proc. of the First Korea-Japan Joint Workshop on Pattern Recognition, pp. 61-66, 2006.
  • Akihiro Mori, Seiichi Uchida, Ryo Kurazume, Rin-ichiro Taniguchi, Tsutomu Hasegawa and Hiroaki Sakoe, Early Recognition and Prediction of Gestures toward Intelligent Man-Machine Interfaces, Proc. of the Second Joint Workshop on Machine Perception and Robotics, CD-ROM, 2006.
  • Ryo Kurazume, Hiroaki Omasa, Seiichi Uchida, Rin-ichiro Taniguchi and Tsutomu Hasegawa, Embodied Proactive Human Interface "PICO-2", Proc. of the International Conference on Pattern Recognition, B04-0204, August 2006.
  • Akihiro Mori, Seiichi Uchida, Ryo Kurazume, Rin-ichiro Taniguchi, Tsutomu Hasegawa and Hiroaki Sakoe, Early Recognition and Prediction of Gestures, Proc. of the International Conference on Pattern Recognition, C02-0725, August 2006.
  • Yutaka Araki, Daisaku Arita, Rin-ichiro Taniguchi, Seiichi Uchida, Ryo Kurazume and Tsutomu Hasegawa, Construction of Symbolic Representation from Human Motion Information, Proc. of the 10th Int. Conf. on Knowledge-Based & Intelligent Information & Engineering Systems (KES2006), 2006.
  • Rin-ichiro Taniguchi, Daisaku Arita, Seiichi Uchida, Ryo Kurazume and Tsutomu Hasegawa, Human Action Sensing for Proactive Human Interface: Computer Vision Approach, Proc. of the International Workshop on Processing Sensory Information for Proactive Systems (PSIPS 2004), Oulu, Finland, 2004.