Paper
Robot free-flyers in space extravehicular activity
1 November 1992
Harald J. Weigl, Harold L. Alexander
Proceedings Volume 1829, Cooperative Intelligent Robotics in Space III; (1992) https://doi.org/10.1117/12.131722
Event: Applications in Optical Science and Engineering, 1992, Boston, MA, United States
Abstract
The Laboratory for Space Teleoperation and Robotics is developing a neutrally buoyant robot for research into the automatic and teleoperated (remote human) control of unmanned robotic vehicles for use in space. The goal of this project is to develop a remote robot with maneuverability and dexterity comparable to that of a space-suited astronaut with a manned maneuvering unit, able to assume many of the tasks currently planned for astronauts during extravehicular activity (EVA). Such a robot would spare the great expense and hazards associated with human EVA and make scientific and industrial exploitation of orbit far less costly. Both autonomous and teleoperated control experiments will require the vehicle to control its position and orientation automatically. The laboratory has developed a real-time vision-based navigation and control system for its underwater space robot simulator, the Submersible for Telerobotic and Astronautical Research (STAR). The system, implemented with standard, inexpensive computer hardware, has excellent performance and robustness characteristics for a variety of applications, including automatic station-keeping and large controlled maneuvers. Experimental results are presented indicating the precision, accuracy, and robustness to disturbances of the vision-based control system. The study demonstrates the feasibility of vision-based control and navigation for remote robots and provides a foundation for developing a system for general space robot tasks. The complex vision sensing problem is reduced through linearization to a simple algorithm, fast enough to be incorporated into a real-time vehicle control system. Vision sensing is structured to detect small changes in vehicle position and orientation from a nominal positional state relative to a target scene. The system uses a constant, linear inversion matrix to measure the vehicle positional state from the locations of navigation features in an image. This paper describes the underwater vehicle's vision-based navigation and control system and applications of vision-based navigation and control for free-flying space robots. Experimental results from underwater tests of STAR's vision system are also presented.
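As a concrete illustration of the linearized sensing approach described in the abstract, the Python sketch below shows how a constant linear inversion matrix can recover a small vehicle state deviation from measured navigation-feature locations. The feature layout, Jacobian values, state dimension, function names, and control gain are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Hypothetical setup: 4 navigation features tracked in the image plane
# (8 measurements) and a 6-DOF vehicle state [x, y, z, roll, pitch, yaw].
# J maps small state perturbations (6,) to feature displacements (8,),
# linearized about the nominal positional state; its entries here are
# placeholders, not numbers from the paper.
J = np.random.default_rng(0).normal(size=(8, 6))

# Constant linear inversion matrix: the Moore-Penrose pseudoinverse of J,
# computed once offline so the real-time loop reduces to one matrix multiply.
M = np.linalg.pinv(J)

def estimate_state_delta(features_measured, features_nominal):
    """Estimate the small deviation of the vehicle state from its nominal
    pose, given measured and nominal feature locations (each shape (4, 2))."""
    df = (features_measured - features_nominal).reshape(-1)  # (8,) displacements
    return M @ df                                            # (6,) state deviation

# Illustrative use inside a station-keeping loop:
features_nominal = np.array([[120., 80.], [400., 85.], [115., 300.], [405., 310.]])
features_measured = features_nominal + np.random.default_rng(1).normal(scale=1.0, size=(4, 2))
dx = estimate_state_delta(features_measured, features_nominal)
thrust_command = -0.5 * dx  # simple proportional correction back toward the nominal pose
```

Because M is constant, the per-frame cost is a single matrix-vector product, which is consistent with the abstract's point that linearization makes the sensing fast enough for real-time vehicle control.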
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Harald J. Weigl and Harold L. Alexander "Robot free-flyers in space extravehicular activity", Proc. SPIE 1829, Cooperative Intelligent Robotics in Space III, (1 November 1992); https://doi.org/10.1117/12.131722
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Space robots
Control systems
Navigation systems
Vision-based navigation
Robotics
Stars
Automatic control
