KEYWORDS: Personal digital assistants, Sun, Information fusion, Sensors, Facial recognition systems, Computing systems, Data fusion, Standards development, Environmental sensing, Global Positioning System
In order to effectively evaluate information fusion systems or emerging technologies, it is critical to quickly, efficiently,
and accurately collect functional and observational data about such systems. One of the best ways to test a system's
capabilities is to have an end user operate it in controlled but realistic field-based situations. Evaluation data of the
systems' performance as well as observational data of the user's interactions can then be collected and analyzed. This
analysis often gives insight into how the system may perform in the intended environment and into any potential areas for
improvement. One common method for collection of this data involves an evaluator/observer generating hand-written
notes, comments, and sketches. This often proves to be inefficient in complex sensor technology field-based evaluation
environments. Personnel at the National Institute of Standards and Technology (NIST) have been tasked with collecting
such evaluation data for emerging soldier-worn sensor systems. Lessons learned from the on-going development of
efficient field-based evaluation data collection techniques will be discussed. The most recent evaluation data collection
using a personal digital assistant (PDA)-style system and details of its use during an evaluation of a multi-team study
will also be described.
Collecting accurate, adequate ground truth and experimental data to support technology evaluations is critical in
formulating exact and methodical analyses of the system's performance. Personnel at the National Institute of Standards
and Technology (NIST), tasked with developing performance measures and standards for both Urban Search and Rescue
(US&R) and bomb disposal robots, have been designing advanced ground truth data collection methods to support these
efforts. These new techniques fuse multiple real-time streams of video and robot tracking data to facilitate more
complete human-robot interaction (HRI) analyses following a robot's experiences. As a robot maneuvers through a test
method, video and audio streams are simultaneously collected and fed into a quad compressor providing real-time
display. This fused quad audio/visual data provides a complete picture of what the operators and robots are doing
throughout their evaluation to not only enhance HRI analyses, but also provide valuable data that can be used to aid
operator training, encourage implementation improvements by highlighting successes and failures to the
developers/vendors, and demonstrate capabilities to end-users and buyers. Quad data collection system deployments to
support US&R test methods/scenarios at the 2007 Robot Response Evaluation in Disaster City, Texas will be
highlighted.