Through-wall radar imaging is an emerging technology of great interest to military and police forces operating
in an urban environment. A through-wall imaging radar can potentially provide interior room layouts as well
as detection and localization of targets of interest within a building. In this paper, we present our through-wall
radar system mounted on the side of a vehicle and driven along a path in front of a building of interest. The
vehicle is equipped with a LIDAR (Light Detection and Ranging) and motion sensors that provide auxiliary
information. The radar uses an ultra wideband frequency-modulated continuous wave (FMCW) waveform to
obtain high range resolution. Our system is composed of a vertical linear receive array to discriminate targets in
elevation, and two transmit elements operated in a slow multiple-input multiple-output (MIMO) configuration
to increase the achievable elevation resolution. High resolution in the along-track direction is obtained through
synthetic aperture radar (SAR) techniques. We present experimental results that demonstrate the 3-D capability
of the radar. We further demonstrate target detection behind challenging walls, and imagery of internal wall
features. Finally, we discuss future work.
KEYWORDS: 3D modeling, 3D acquisition, 3D image processing, Data modeling, Clouds, Databases, Video, Detection and tracking algorithms, RGB color model, LIDAR
3D imagery has a well-known potential for improving situational awareness and battlespace visualization by
providing enhanced knowledge of uncooperative targets. This potential arises from the numerous advantages
that 3D imagery offers over traditional 2D imagery, which increase the accuracy of automatic target
detection (ATD) and recognition (ATR). Despite advancements in both 3D sensing and 3D data exploitation,
3D imagery has yet to demonstrate a true operational gain, partly due to the processing burden of the massive
dataloads generated by modern sensors. In this context, this paper describes the current status of a workbench
designed for the study of 3D ATD/ATR. Among the project goals is the comparative assessment of algorithms
and 3D sensing technologies given various scenarios. The workbench comprises three components: a
database, a toolbox, and a simulation environment. The database stores, manages, and edits input data of
various types such as point clouds, video, still imagery frames, CAD models and metadata. The toolbox features
data processing modules, including range data manipulation, surface mesh generation, texture mapping, and
a shape-from-motion module to extract a 3D target representation from video frames or from a sequence of
still imagery. The simulation environment includes synthetic point cloud generation, a 3D ATD/ATR algorithm
prototyping environment, and performance metrics for comparative assessment. In this paper, the workbench
components are described and preliminary results are presented. LADAR, video, and still imagery datasets collected
during airborne trials are also detailed.
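The comparative assessment of ATD/ATR algorithms ultimately rests on detection metrics. As one possible example of such a metric module (the abstract does not specify which metrics the workbench implements, so the choice of precision/recall/F1 here is an assumption):

```python
# One possible metric module for comparing detectors across scenarios; the
# choice of precision/recall/F1 here is an assumption, not taken from the paper.
def detection_metrics(tp: int, fp: int, fn: int) -> tuple:
    """Precision, recall, and F1 from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# e.g. 8 correct detections, 2 false alarms, 2 missed targets:
p, r, f1 = detection_metrics(8, 2, 2)
print(p, r, f1)
```

Scoring every algorithm/sensor pairing on the same ground-truthed scenarios is what makes the workbench's comparisons meaningful.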
Mapping the interior of buildings is of great interest to military forces operating in an urban battlefield. Through-wall
radars have the potential to map interior room layouts, including the location of walls, doors, and furniture.
They could provide information on the in-wall structure and detect objects of interest concealed in buildings,
such as persons and arms caches. We propose to provide further context to the end user by fusing the
radar data with LIDAR (Light Detection and Ranging) images of the building exterior.
In this paper, we present our system concept of operation, which involves a vehicle driven along a path
in front of a building of interest. The vehicle is equipped with both radar and LIDAR systems, as well as a
motion compensation unit. We describe our ultra-wideband through-wall L-band radar system, which uses stretch
processing techniques to obtain high range resolution, and synthetic aperture radar (SAR) techniques to achieve
good azimuth resolution. We demonstrate its current 2-D capabilities with experimental data, and discuss the
current progress in using array processing in elevation to provide a 3-D image. Finally, we show preliminary
data fusion of SAR and LIDAR data.
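Stretch processing, mentioned above, deramps the received chirp against the transmitted one so that each target's range appears as a beat frequency, f_b = 2RK/c for chirp rate K. A minimal single-target sketch follows; all numerical parameters are illustrative assumptions, not the system's actual values.

```python
import numpy as np

# Sketch of stretch (deramp) processing for one ideal point target.
# All parameters are illustrative assumptions, not the system's actual values.
c = 3e8            # speed of light, m/s
B = 1e9            # swept bandwidth, Hz
T = 1e-3           # sweep duration, s
K = B / T          # chirp rate, Hz/s
fs = 2e6           # sample rate of the deramped (beat) signal, Hz
R = 30.0           # true target range, m

t = np.arange(int(T * fs)) / fs
fb = 2 * R * K / c                  # beat frequency encodes range
beat = np.exp(2j * np.pi * fb * t)  # ideal deramped return

# A single FFT turns the beat frequency back into a range estimate.
spec = np.abs(np.fft.fft(beat))
freqs = np.fft.fftfreq(beat.size, 1 / fs)
R_est = freqs[np.argmax(spec)] * c / (2 * K)
print(round(R_est, 2))
```

The benefit of deramping is that a narrowband receiver and a modest-rate ADC suffice even though the transmitted sweep is ultra-wideband, which is what makes the high range resolution practical.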
The use of robots for (semi-)autonomous operations in complex terrain such as urban environments poses difficult mobility, mapping, and perception challenges. To work efficiently, a robot should be provided with sensors and software that let it perceive and analyze the world in 3D; real-time 3D sensing and perception are paramount in this operational context. To address these challenges, DRDC Valcartier has, over the past several years, developed a compact sensor that combines a wide-baseline stereo camera and a laser scanner with a full 360-degree azimuth and 55-degree elevation field of view, allowing the robot to see and manage overhanging obstacles as well as obstacles at ground level.

Sensing in 3D is common, but to navigate and work efficiently in complex terrain, the robot should also perceive, decide, and act in three dimensions; 3D information should therefore be preserved and exploited at every step of the process. To achieve this, we use a multiresolution octree to store the acquired data, allowing mapping of large environments while keeping the representation compact and memory efficient. Ray tracing is used to build and update the 3D occupancy model, which is then used, via a temporary 2.5D map, for navigation, obstacle avoidance, and efficient frontier-based exploration.

This paper describes the volumetric sensor concept, details its design features, and presents an overview of the 3D software framework that preserves 3D information through all computation steps. Simulation and real-world experiments are presented at the end of the paper to demonstrate the key elements of our approach.
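The ray-traced occupancy update described above can be sketched in miniature. Here a flat dictionary of voxel indices stands in for the multiresolution octree, and a fixed-step ray march stands in for a proper voxel traversal; the resolution, log-odds weights, and all function names are illustrative assumptions rather than details from the paper.

```python
import math

# Minimal log-odds occupancy map updated by ray tracing. A flat dictionary of
# voxel indices stands in for the multiresolution octree; the resolution and
# hit/miss weights below are illustrative assumptions, not values from the paper.
RES = 0.1                    # voxel edge length, m
L_HIT, L_MISS = 0.85, -0.4   # log-odds increments for occupied / free cells

def voxel(p):
    """Map a 3-D point to its integer voxel index."""
    return tuple(int(math.floor(c / RES)) for c in p)

def ray_voxels(origin, endpoint, step=RES / 2):
    """Voxels crossed marching from origin toward (but not into) endpoint."""
    d = math.dist(origin, endpoint)
    n = max(1, int(d / step))
    cells = []
    for i in range(n):
        t = i / n
        v = voxel(tuple(o + t * (e - o) for o, e in zip(origin, endpoint)))
        if not cells or cells[-1] != v:   # skip consecutive duplicates
            cells.append(v)
    return cells

def integrate(grid, origin, endpoint):
    """Mark cells along the beam as free and the endpoint cell as occupied."""
    end_v = voxel(endpoint)
    for v in ray_voxels(origin, endpoint):
        if v != end_v:
            grid[v] = grid.get(v, 0.0) + L_MISS
    grid[end_v] = grid.get(end_v, 0.0) + L_HIT

grid = {}
integrate(grid, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0))   # one beam hitting at 1 m
print(grid[voxel((1.0, 0.0, 0.0))] > 0)             # endpoint cell occupied
```

A real octree adds what this sketch omits: coarse nodes for large free or unknown regions, which keeps memory bounded when mapping large environments.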