An immersive viewing engine providing basic telepresence functionality for a variety of application types is presented. Augmented reality, teleoperation and virtual reality applications all benefit from head-mounted display devices that present imagery appropriate to the user's head orientation at full frame rates. Our primary application is the viewing of remote environments, as with a camera-equipped teleoperated vehicle. The conventional approach, in which imagery from a narrow-field camera onboard the vehicle is presented to the user on a small rectangular screen, is contrasted with an immersive viewing system in which a cylindrical or spherical format image is received from a panoramic camera on the vehicle, resampled in response to sensed user head orientation, and presented via a wide-field eyewear display approaching 180 degrees of horizontal field. Of primary interest is the user's enhanced ability to perceive and understand image content, even when image resolution is poor, owing to the innate visual integration and 3-D model generation capabilities of the human visual system. A mathematical model for tracking user head position and resampling the panoramic image to attain distortion-free viewing of the region appropriate to the user's current head pose is presented, and consideration is given to providing the user with stereo viewing generated from depth map information derived using stereo-from-motion algorithms.
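As an illustration of the resampling step, the sketch below maps a sensed head yaw and pitch onto a rectilinear view extracted from an equirectangular panorama. The panorama layout, pinhole display model and nearest-neighbour sampling are assumptions made for illustration, not the system's actual implementation.

```python
# A minimal sketch (assumptions, not the paper's implementation) of resampling
# an equirectangular 360-degree panorama into a distortion-free view for the
# current head pose.
import numpy as np

def rotation_yaw_pitch(yaw, pitch):
    """Rotation matrix for a head pose given as yaw (about the vertical axis)
    followed by pitch (about the horizontal axis), angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[ cy, 0.0,  sy],
                   [0.0, 1.0, 0.0],
                   [-sy, 0.0,  cy]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0,  cp, -sp],
                   [0.0,  sp,  cp]])
    return Ry @ Rx

def resample_view(pano, yaw, pitch, out_w=640, out_h=480, hfov_deg=90.0):
    """Nearest-neighbour resampling of the panorama into a rectilinear view."""
    pano_h, pano_w = pano.shape[:2]
    f = (out_w / 2.0) / np.tan(np.radians(hfov_deg) / 2.0)    # pinhole focal length
    u, v = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                       np.arange(out_h) - out_h / 2.0)
    rays = np.stack([u, -v, np.full_like(u, f)], axis=-1)     # camera-frame rays
    rays = rays @ rotation_yaw_pitch(yaw, pitch).T            # rotate into world frame
    x, y, z = rays[..., 0], rays[..., 1], rays[..., 2]
    lon = np.arctan2(x, z)                                    # panorama longitude
    lat = np.arcsin(y / np.linalg.norm(rays, axis=-1))        # panorama latitude
    cols = ((lon / (2 * np.pi) + 0.5) * pano_w).astype(int) % pano_w
    rows = np.clip(((0.5 - lat / np.pi) * pano_h).astype(int), 0, pano_h - 1)
    return pano[rows, cols]

# Usage: view = resample_view(panorama_image, yaw=0.3, pitch=-0.1)
```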
A design for a low-cost, wide-field, low-profile, lightweight, full-color, stereoscopic, see-through eyewear display called the personal viewer (PV) is presented. After a brief review of current methods and popular products, the principles of operation of the personal viewer are presented and several of the difficulties faced in implementation are discussed. Specifically, imaging is accomplished by covering each eye with a transparent, semi-reflective half-ellipsoid, approximately 6 cm in diameter and 7 cm in length, with its lower focus at the user's pupil and its upper focus at a two-axis scanner just above the user's brow. A 1 mm collimated optical beam is directed onto the scanner mirror, from which a raster pattern is projected onto the inside surface of the ellipsoid and reflected into the user's pupil. The perceived source of the beam is the point of reflection, providing a 120 degree field breadth at each eye with a 60 degree overlap of the right and left fields where stereo fusion is available. Finally, a reduced-bandwidth teleoperation support architecture is described in which orientation sensors on the viewer control a dynamic mapping of the viewer display space onto a 360 degree image in response to user head motion.
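The reduced-bandwidth idea can be sketched as a simple region-of-interest crop: only the arc of the 360 degree source image that the current head yaw can see, plus a small prefetch margin, is transmitted. The field and margin values below are illustrative assumptions, not the described architecture.

```python
# A minimal sketch of head-driven region-of-interest transmission from a
# 360-degree image (field and margin values are illustrative assumptions).
import numpy as np

def visible_columns(head_yaw_deg, pano_width, field_deg=120.0, margin_deg=15.0):
    """Column indices of the 360-degree panorama covering the display field
    centred on the user's head yaw, with wrap-around handled by modulo."""
    half = (field_deg + 2 * margin_deg) / 2.0
    lo = int((head_yaw_deg - half) / 360.0 * pano_width)
    hi = int((head_yaw_deg + half) / 360.0 * pano_width)
    return np.arange(lo, hi) % pano_width

def crop_for_transmission(pano, head_yaw_deg):
    """Slice out only the visible arc; a 150-degree arc is roughly 40 percent
    of the full-panorama bandwidth."""
    return pano[:, visible_columns(head_yaw_deg, pano.shape[1])]
```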
KEYWORDS: Visual process modeling, Robotics, Control systems, Sensors, Cameras, Process control, Visualization, Data modeling, Systems modeling, Robotic systems
A brief survey of current autonomous vehicle (AV) projects is presented with the intent of finding common infrastructure or subsystems that can be configured from commercially available modular robotic components, thereby providing developers with greatly reduced timelines and costs and encouraging focus on the selected problem domain. The Modular Manipulator System (MMS), a robotic system based on single-degree-of-freedom rotary and linear modules, is introduced, and some approaches to autonomous vehicle configuration and deployment are examined. The modules may be configured to provide articulated suspensions for very rugged terrain and fall recovery, articulated sensors and tooling, and a limited capacity for self-repair and self-reconfiguration. The MMS on-board visually programmed control software (Model Manager) supports experimentation with novel physical configurations and behavior algorithms via real-time 3D graphics for operations simulation, and provides useful subsystems for vision, learning and planning to host intelligent behavior.
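A configuration of this kind might be expressed declaratively as a list of joint modules and passive links, as in the hypothetical sketch below; the field names and the example suspension leg are illustrative assumptions, not the MMS catalogue.

```python
# A hypothetical sketch of describing a vehicle subsystem from 1-DOF modules
# and passive links (fields and values are illustrative, not the MMS spec).
from dataclasses import dataclass

@dataclass
class Joint:
    kind: str            # "rotary" or "linear"
    diameter_cm: float   # module size class
    axis: tuple          # joint axis in the parent frame

@dataclass
class Link:
    length_m: float      # passive rigid tube between joints

# One articulated suspension leg: hip yaw, thigh link, knee pitch, shank link.
suspension_leg = [
    Joint("rotary", 10, (0, 0, 1)), Link(0.30),
    Joint("rotary", 10, (0, 1, 0)), Link(0.25),
]
```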
A model is presented for predicting classification performance for systems having a large population of classes. The cases of large and small training set size for each class are treated separately. A method is proposed for measuring classification performance as the mean ranking statistic ρ_E, which is derived from the average information content h_E of the system feature vector, which is in turn derived from the system covariance matrices (Σ_W, Σ_B). This method for predicting ρ_E is applied to the large training set case (case 1), explaining why performance is not compromised but improved by adding noisy features. The method is extended for predicting performance in the more difficult small training set case (case 2), explaining why performance may be compromised by the addition of noisy features in that situation.
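Under a Gaussian reading of the abstract, the covariance matrices, an information measure and an empirical mean ranking can be sketched as follows; the specific information formula is a plausible stand-in, not necessarily the paper's h_E, and the mean ranking is estimated with a nearest-class-mean rule.

```python
# A hedged sketch of the quantities named in the abstract under a Gaussian
# model: Sigma_W, Sigma_B, a plausible information measure, and an empirical
# mean ranking of the true class (not necessarily the paper's definitions).
import numpy as np

def class_covariances(X, y):
    """Within-class (Sigma_W) and between-class (Sigma_B) covariance matrices."""
    mu = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    Sb = np.zeros_like(Sw)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    return Sw / len(X), Sb / len(X)

def gaussian_info_content(Sw, Sb):
    """One plausible average-information measure: 0.5 * log det(I + Sw^-1 Sb)."""
    return 0.5 * np.linalg.slogdet(np.eye(len(Sw)) + np.linalg.solve(Sw, Sb))[1]

def mean_ranking(X, y, means):
    """Average rank of the true class (labels 0..C-1 indexing `means`) when
    classes are sorted by distance to their means; rank 1 is a correct top choice."""
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)   # (n, classes)
    ranks = (d < d[np.arange(len(X)), y][:, None]).sum(axis=1) + 1
    return ranks.mean()
```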
KEYWORDS: Systems modeling, Data modeling, Visualization, Computer programming, Visual process modeling, Composites, Motion models, 3D modeling, Intelligence systems, Visual programming languages
An experimental modeling language for general-purpose simulation, robotic control and factory automation is presented. The advantages of a visual programming interface and a data-flow architecture are examined. Primarily, the model-based organization is proposed as a means of integrating various intelligent systems disciplines to enhance problem-solving abilities and improve system utility.
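A minimal data-flow sketch, purely illustrative and not the described language, shows the organizing idea: computation is expressed by wiring nodes together and pulling values through the graph rather than by writing sequential control code.

```python
# A minimal sketch of the data-flow idea (an illustration, not the Model
# Manager design): each node wraps a function and is wired to its input nodes.
class Node:
    def __init__(self, fn, *inputs):
        self.fn, self.inputs = fn, inputs

    def pull(self):
        """Evaluate this node by pulling values through its input nodes."""
        return self.fn(*(n.pull() for n in self.inputs))

# Example graph: a sensor reading scaled and thresholded into a command.
sensor  = Node(lambda: 0.42)                    # source node
scaled  = Node(lambda s: s * 100.0, sensor)     # transform node
command = Node(lambda v: v > 30.0, scaled)      # decision node
print(command.pull())                           # -> True
```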
KEYWORDS: Process control, Control systems, Signal processing, Systems modeling, Robotic systems, Data modeling, Visual process modeling, Computer programming, Model-based design, Sensors
The Modular Manipulator System (MMS) is a general-purpose, user-configurable modular robotic system that provides rapid and inexpensive implementation of standard or customized manipulator geometries of arbitrary complexity, tailored to the needs of individual researchers or application engineers. Structures are configured from self-contained 1-DOF rotary or linear JOINT modules, each of which includes an on-board control processor, power amplifier, DC servomotor, high-precision position sensor and a fast, rigid connect/disconnect latch. The joints are connected together by passive rigid LINK tubes that define the manipulator geometry. These components are all offered in 5, 7, 10, 14 and 20 cm diameters, with power density and positional accuracy competitive with other commercial manipulators.
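The kinematic description implied by a chain of 1-DOF JOINT modules and rigid LINK tubes can be sketched as a composition of homogeneous transforms; the axis conventions, the link-along-z assumption and the helper names below are illustrative, not the MMS controller interface.

```python
# A hedged sketch of forward kinematics for a chain of 1-DOF rotary joints
# separated by rigid links (conventions are illustrative assumptions).
import numpy as np

def rot(axis, angle):
    """Homogeneous rotation about a principal axis ('x', 'y' or 'z')."""
    c, s = np.cos(angle), np.sin(angle)
    R = {"x": [[1, 0, 0], [0, c, -s], [0, s, c]],
         "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
         "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]]}[axis]
    T = np.eye(4)
    T[:3, :3] = R
    return T

def trans(length):
    """Homogeneous translation along the local z axis (a rigid LINK tube)."""
    T = np.eye(4)
    T[2, 3] = length
    return T

def forward_kinematics(chain, angles):
    """Compose joint rotations and link offsets into the tool-plate pose."""
    T = np.eye(4)
    for (axis, link_len), q in zip(chain, angles):
        T = T @ rot(axis, q) @ trans(link_len)
    return T

# A 3-joint arm: base yaw, shoulder pitch and elbow pitch with 0.3 m links.
chain = [("z", 0.0), ("y", 0.3), ("y", 0.3)]
print(forward_kinematics(chain, np.radians([30, 45, -60]))[:3, 3])  # tool position
```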