Legged motion control

Dynamically stable gaits for quadruped walking vehicles

For dynamically stable locomotion in a quadruped robot, the integration of trajectory planning for body and leg positions (feedforward control) with adaptive control based on sensory feedback (feedback control) is essential. To address this, we propose a novel control technique for achieving a stable trot gait, termed the 3D Sway Compensation Trajectory. This method combines rotational and translational body movements along the diagonal axis connecting the supporting feet, with vertical swing motions of the swing-phase legs.
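To illustrate the idea, the minimal Python sketch below generates one support phase of such a trajectory under simplifying assumptions: the translational component is modeled as a sinusoidal sway that shifts the projected center of gravity toward the line connecting the supporting feet, and the swing legs follow a half-sine vertical lift. The rotational component about the support diagonal is omitted, and all function names, amplitudes, and timing are illustrative, not the published controller.

    import numpy as np

    def sway_compensation_3d(t, T_step, support_feet, sway_amp=0.03, lift_height=0.04):
        """Hypothetical sketch of a 3D sway-compensation trajectory for one trot step.

        t            : time within the current support phase [s]
        T_step       : duration of the support phase [s]
        support_feet : (2, 2) array, (x, y) positions of the two diagonal support feet
        sway_amp     : assumed lateral body-sway amplitude [m]
        lift_height  : assumed swing-leg clearance [m]
        """
        phase = t / T_step

        # Unit vector along the diagonal connecting the support feet and its
        # in-plane normal; the body sways along the normal so the projected
        # center of gravity stays close to the support diagonal.
        diag = support_feet[1] - support_feet[0]
        diag = diag / np.linalg.norm(diag)
        normal = np.array([-diag[1], diag[0]])

        # Sinusoidal body sway toward the support diagonal (one half period per step).
        body_xy_offset = sway_amp * np.sin(np.pi * phase) * normal

        # Vertical swing-leg motion: half-sine lift and touchdown.
        swing_leg_z = lift_height * np.sin(np.pi * phase)

        return body_xy_offset, swing_leg_z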

Vertical leg motion / Stabilization of unsymmetrical trot gait / Intermittent trot gait

Study on energy efficiency of quadruped walking vehicles

Although legged robots offer superior terrain adaptability compared to wheeled vehicles, their locomotion speed is generally lower. To achieve higher speeds, dynamically stable gaits—such as running in biped robots and trotting or bounding in quadruped robots—are considered promising. However, these dynamically stable gaits typically exhibit lower energy efficiency than more stable gaits like the crawl gait. In this study, we experimentally investigate the energy efficiency of a quadruped walking robot. Using a quadruped robot named TITAN-VIII, we compare the energy consumption of two variations of the trot gait. The experimental results demonstrate that the proposed 3D Sway Compensation Trajectory improves energy efficiency compared to the conventional sway compensation trajectory.
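A common way to quantify such a comparison is the specific resistance (cost of transport), E/(m·g·d), where E is the consumed energy, m the robot mass, g gravity, and d the distance traveled; a lower value means a more efficient gait. The Python sketch below computes it from logged joint power. The logging format and the example values are assumptions for illustration, not the actual TITAN-VIII measurement setup, and the paper may use a different metric.

    import numpy as np

    def specific_resistance(time_s, joint_voltage, joint_current, mass_kg, distance_m, g=9.81):
        """Specific resistance epsilon = E / (m * g * d); lower is more efficient.

        time_s        : (N,) sample times [s]
        joint_voltage : (N, n_joints) motor voltages [V]   (assumed log format)
        joint_current : (N, n_joints) motor currents [A]   (assumed log format)
        """
        power = np.sum(joint_voltage * joint_current, axis=1)  # total electrical power [W]
        energy = np.trapz(power, time_s)                       # consumed energy [J]
        return energy / (mass_kg * g * distance_m)

    # Comparing two gait variants over the same distance with the same robot
    # (mass and distance values are placeholders):
    # eps_3d_sway      = specific_resistance(t, v_a, i_a, mass_kg=80.0, distance_m=5.0)
    # eps_conventional = specific_resistance(t, v_b, i_b, mass_kg=80.0, distance_m=5.0)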

Stable gait control for a biped robot

The sway compensation trajectory, originally developed for dynamically stable walking in quadruped robots, has been adapted for application in bipedal robots. This approach facilitates the design of stable Zero Moment Point (ZMP) and Center of Gravity (COG) trajectories, traditionally considered complex and sensitive tasks. The effectiveness of the proposed method has been validated through both computer simulations and physical walking experiments using the humanoid robots HOAP-1 and HOAP-2.
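One reason a well-shaped sway trajectory simplifies this design is that, under the widely used cart-table (linear inverted pendulum) approximation, the ZMP implied by a planned COG trajectory can be checked directly. The Python sketch below shows that check assuming a constant COG height; it is only an illustration of the approximation, not the controller used on HOAP-1 or HOAP-2.

    import numpy as np

    def zmp_from_cog(cog_xy, dt, z_c, g=9.81):
        """Approximate ZMP from a planned horizontal COG trajectory (cart-table model).

        cog_xy : (N, 2) planned COG positions [m]
        dt     : sampling period [s]
        z_c    : assumed constant COG height [m]

        p = c - (z_c / g) * c_ddot, valid while the COG height is nearly constant.
        """
        cog_acc = np.gradient(np.gradient(cog_xy, dt, axis=0), dt, axis=0)
        return cog_xy - (z_c / g) * cog_acc

    # A stable sway-compensation COG trajectory keeps the resulting ZMP inside
    # the support polygon of the stance foot throughout the step.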

Walking experiment / Dance by HOAP / Omni-directional walking motion / Step climbing / Dance step by fusing dynamically and statically stable walking

Straight legged walking for a biped robot

During human walking, the knee joints periodically extend and flex. However, replicating this motion in bipedal robots is challenging due to their limited degrees of freedom (DOFs). To address this, we propose a novel method for generating a straight-legged walking pattern in biped robots by leveraging the vertical motion of the upper body. First, we introduce two new indices: the Knee Stretch Index (KSI) and the Knee Torque Index (KTI), which quantify the efficiency of knee joint utilization. Then, an up-and-down motion of the upper body is automatically planned to optimize these indices, thereby enabling straight-legged walking. The fundamental concept of the proposed method is as follows:

  1. When many DOFs are needed to control the Zero Moment Point (ZMP), the robot lowers its body height.
  2. When surplus DOFs are available, the body is elevated and the knee joints are extended.

By extending the knee joints, a more human-like and natural walking motion is achieved. Additionally, this approach enhances energy efficiency, as the required torque and energy consumption for supporting body weight at the knee joints are reduced.
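
The KSI and KTI are defined in the paper; the Python sketch below only illustrates the underlying geometric relation and the body-height scheduling idea under assumed link lengths and limits: raising the hip over the ankle extends the knee, while lowering it flexes the knee and frees vertical motion for ZMP control. The index and height values shown are stand-ins, not the published definitions.

    import numpy as np

    def knee_angle(hip_height, l_thigh=0.30, l_shank=0.30):
        """Inner knee angle [rad] for a planar two-link leg with the hip directly
        above the ankle; pi corresponds to a fully straight leg (law of cosines).
        Link lengths are assumed values, not the robot's actual dimensions."""
        c = (l_thigh**2 + l_shank**2 - hip_height**2) / (2.0 * l_thigh * l_shank)
        return np.arccos(np.clip(c, -1.0, 1.0))

    def knee_stretch_index(hip_height, l_thigh=0.30, l_shank=0.30):
        """Stand-in normalized index: 1.0 for a straight knee, smaller when flexed."""
        return knee_angle(hip_height, l_thigh, l_shank) / np.pi

    def plan_hip_height(dof_surplus, z_min=0.48, z_max=0.595):
        """Schedule the body height from a DOF-surplus measure in [0, 1]:
        0 -> many DOFs are needed for ZMP control, keep the body low;
        1 -> surplus DOFs are available, raise the body and extend the knees."""
        return z_min + np.clip(dof_surplus, 0.0, 1.0) * (z_max - z_min)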

Straight legged walking

Proactive Human Interface based on Embodied Agents

We are conducting research on an Embodied Proactive Human Interface, which aims to develop a human-friendly active interface based on two core technologies: a mechanism for estimating human intentions to enable natural communication—referred to as the Proactive Interface—and a tangible interface realized through robotic technology. As part of this research, we have developed a bipedal humanoid robot named PICO-2, which serves as a tangible telecommunication device for the proactive human interface. To enable embodied telecommunication using PICO-2, we propose a novel technique for tracking human gestures via a monocular video camera mounted on the robot, as well as natural gesture reproduction by PICO-2 that accounts for differences in body structure between the user and the robot. To evaluate the system, remote communication experiments were conducted using two PICO-2 robots placed at distant campus locations.
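The Python sketch below illustrates only the general idea of reproducing a tracked gesture on a robot whose body structure differs from the user's: clamp the estimated human joint angles to the robot's joint limits and smooth them over time so the motion stays feasible. The joint names, limits, and filtering are hypothetical and do not describe PICO-2's actual kinematics or the published tracking and reproduction method.

    import numpy as np

    # Hypothetical joint limits [rad]; PICO-2's real limits are not shown here.
    ROBOT_JOINT_LIMITS = {
        "shoulder_pitch": (-2.0, 2.0),
        "shoulder_roll":  (-0.3, 1.5),
        "elbow_pitch":    (-2.2, 0.0),
    }

    def retarget_pose(human_angles, prev_robot_angles=None, smooth=0.8,
                      limits=ROBOT_JOINT_LIMITS):
        """Map tracked human joint angles onto the robot: clamp each angle to the
        robot's joint range and low-pass filter it so the reproduced gesture
        stays smooth and feasible despite the different body structure."""
        robot_angles = {}
        for joint, angle in human_angles.items():
            lo, hi = limits[joint]
            clamped = float(np.clip(angle, lo, hi))
            if prev_robot_angles is not None:
                clamped = smooth * prev_robot_angles[joint] + (1.0 - smooth) * clamped
            robot_angles[joint] = clamped
        return robot_angles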

Photos: PICO, PICO-2, PICO-2K
Mimic gestures / ZMP balance control / Omnidirectional motion / Motion tracking by monocular camera / Remote communication experiments

Papers

  • Seiichi Uchida, Akihiro Mori, Ryo Kurazume, Rin-ichiro Taniguchi, and Tsutomu Hasegawa, Logical DP Matching for Detecting Similar Subsequence, Proc. of the 8th Asian Conference on Computer Vision, November 2007.
  • Masato Nakajima, Seiichi Uchida, Akihiro Mori, Ryo Kurazume, Rin-ichiro Taniguchi, Tsutomu Hasegawa, and Hiroaki Sakoe, Motion Prediction Based on Eigen-Gestures, Proc. of the First Korea-Japan Joint Workshop on Pattern Recognition, pp. 61-66, 2006.
  • Akihiro Mori, Seiichi Uchida, Ryo Kurazume, Rin-ichiro Taniguchi, Tsutomu Hasegawa, and Hiroaki Sakoe, Early Recognition and Prediction of Gestures toward Intelligent Man-Machine Interfaces, Proc. of the Second Joint Workshop on Machine Perception and Robotics, CD-ROM, 2006.
  • Ryo Kurazume, Hiroaki Omasa, Seiichi Uchida, Rin-ichiro Taniguchi, and Tsutomu Hasegawa, Embodied Proactive Human Interface "PICO-2", Proc. of the International Conference on Pattern Recognition, B04-0204, August 2006.
  • Akihiro Mori, Seiichi Uchida, Ryo Kurazume, Rin-ichiro Taniguchi, Tsutomu Hasegawa, and Hiroaki Sakoe, Early Recognition and Prediction of Gestures, Proc. of the International Conference on Pattern Recognition, C02-0725, August 2006.
  • Yutaka Araki, Daisaku Arita, Rin-ichiro Taniguchi, Seiichi Uchida, Ryo Kurazume, and Tsutomu Hasegawa, Construction of Symbolic Representation from Human Motion Information, Proc. of the 10th International Conference on Knowledge-Based & Intelligent Information & Engineering Systems (KES 2006), 2006.
  • Rin-ichiro Taniguchi, Daisaku Arita, Seiichi Uchida, Ryo Kurazume, and Tsutomu Hasegawa, Human Action Sensing for Proactive Human Interface: Computer Vision Approach, Proc. of the International Workshop on Processing Sensory Information for Proactive Systems (PSIPS 2004), Oulu, Finland, 2004.