Title: Human-robot interaction strategies for walker-assisted locomotion
Author(s): Cifuentes García, Carlos Andrés
Advisor: Frizera Neto, Anselmo
Co-advisors: Bastos Filho, Teodiano Freire; Carelli, Ricardo
Keywords: Robotic walker; Multimodal interface
Issue date: 25-Jun-2015
Publisher: Universidade Federal do Espírito Santo
Abstract: Neurological and age-related diseases affect human mobility at different levels, causing partial or total loss of that faculty. There is a significant need to improve safe and efficient ambulation for patients with gait impairments. In this context, walkers offer important benefits for human mobility, improving balance and reducing the load on the lower limbs. Most importantly, walkers encourage the use of the patient's residual mobility capacities in different environments. In the field of robotic technologies for gait assistance, a new category of walkers has emerged that integrates robotics, electronics, and mechanics. Such devices are known as "robotic walkers", "intelligent walkers", or "smart walkers". One important aspect common to assistive technologies and rehabilitation robotics is the intrinsic interaction between the human and the robot. This thesis explores the concept of Human-Robot Interaction (HRI) for human locomotion assistance. This interaction comprises two interdependent components. On the one hand, the key role of the robot in Physical HRI (pHRI) is to generate supplementary forces that empower human locomotion, which involves a net flow of power between both actors. On the other hand, a crucial role of Cognitive HRI (cHRI) is to make the human aware of the robot's capabilities while allowing them to remain in control of the robot at all times. This doctoral thesis presents a new multimodal human-robot interface for testing and validating control strategies applied to a robotic walker for assisting human mobility and gait rehabilitation.
This interface extracts navigation intentions through a novel sensor fusion method that combines: (i) a Laser Range Finder (LRF) sensor to estimate the kinematics of the user's legs, (ii) wearable Inertial Measurement Unit (IMU) sensors to capture the human and robot orientations, and (iii) force sensors that measure the physical interaction between the human's upper limbs and the robotic walker. Two closed control loops were developed to naturally adapt the walker's position and to perform body-weight-support strategies. First, a force interaction controller generates velocity commands for the walker based on the upper-limb physical interaction. Second, an inverse kinematics controller keeps the walker at a desired position relative to the human, improving that interaction. The proposed control strategies are suitable for natural human-robot interaction, as shown during the experimental validation. Moreover, the sensor fusion methods used to estimate the control inputs were presented and validated. In the experimental studies, parameter estimation was precise and unbiased, and it showed repeatability when speed changes and continuous turns were performed.
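The force interaction controller described above maps upper-limb forces measured at the walker's handles to velocity commands. As an illustration only, the sketch below implements a generic admittance-style mapping with first-order smoothing; the gains, variable names, and filter are assumptions for this example, not the control law from the thesis.

```python
# Hedged sketch of an admittance-style force interaction controller:
# handlebar force/torque readings are mapped to walker velocity commands.
# Gains and the low-pass filter are illustrative assumptions.

class ForceInteractionController:
    def __init__(self, k_v=0.05, k_w=0.3, alpha=0.2):
        self.k_v = k_v      # linear gain (m/s per N), assumed value
        self.k_w = k_w      # angular gain (rad/s per N*m), assumed value
        self.alpha = alpha  # low-pass filter coefficient in (0, 1]
        self.v = 0.0        # filtered linear velocity command (m/s)
        self.w = 0.0        # filtered angular velocity command (rad/s)

    def update(self, f_forward, torque_z):
        """Map net forward force (N) and yaw torque (N*m) from the
        upper-limb force sensors to smoothed velocity commands."""
        v_raw = self.k_v * f_forward
        w_raw = self.k_w * torque_z
        # First-order smoothing attenuates sensor noise so the walker
        # does not respond jerkily to small force fluctuations.
        self.v += self.alpha * (v_raw - self.v)
        self.w += self.alpha * (w_raw - self.w)
        return self.v, self.w
```

In this scheme, a sustained forward push makes the commanded linear velocity converge toward `k_v * f_forward`, while an asymmetric push (a yaw torque) produces a turning command.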
Appears in collections: PPGEE - Teses de doutorado

Files in this item:
File: tese_8979_[Cifuentes2015]Thesis20160322-161800.pdf
Size: 19.45 MB
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.