Sensors used in autonomous driving are affected to varying degrees by adverse weather. The reliability of each sensor changes with lighting, precipitation type and intensity, and visibility. Cameras and LiDARs, the most common sensing modalities in autonomous driving, provide high-resolution signals but degrade in poor weather; camera image contrast is also sensitive to changes in lighting. RADARs, on the other hand, offer long range but relatively low spatial resolution, yet remain resilient to lighting and weather fluctuations. A compact weather detection system can therefore dynamically steer other algorithms in an autonomous vehicle (e.g., vehicle control, object detection and tracking) toward the most reliable sensors. By adjusting per-sensor weights, a fusion scheme can shift focus to the currently best-performing sensor combination. Alternatively, a weather detection system could switch between weather-specific models or ensemble algorithms. Multi-model algorithms for autonomous driving have been gaining popularity, since a model trained in sunny conditions will likely underperform in non-ideal weather. This paper presents a compact Convolutional Neural Network (CNN) that detects the current driving weather conditions from a narrow strip of the grayscale forward-facing camera image. Standard multi-class classification metrics are used to assess performance; the weighted-average F1-score across all classes was 94%. The model was trained and tested on the RADIATE dataset, which contains multimodal sensor data recorded while driving in different weather conditions, including sunny, snow, overcast, fog, and rain.
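The reported 94% figure is a support-weighted average of per-class F1-scores. As a minimal sketch of how that aggregate is computed (the class precision, recall, and support values below are illustrative placeholders, not numbers from the paper):

```python
# Sketch: support-weighted average F1 across classes.
# All per-class numbers here are illustrative, not from the paper.

def f1(precision, recall):
    """Harmonic mean of precision and recall for one class."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def weighted_f1(per_class):
    """per_class: list of (precision, recall, support) tuples.
    Returns the support-weighted average F1 across all classes."""
    total = sum(s for _, _, s in per_class)
    return sum(f1(p, r) * s for p, r, s in per_class) / total

# Illustrative numbers for the five RADIATE weather classes.
classes = [
    (0.96, 0.95, 500),  # sunny
    (0.93, 0.92, 300),  # snow
    (0.94, 0.95, 400),  # overcast
    (0.92, 0.90, 150),  # fog
    (0.95, 0.94, 250),  # rain
]
print(f"weighted F1 = {weighted_f1(classes):.3f}")
```

Weighting by support prevents small classes (e.g., fog) from dominating or being ignored in the single summary number.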
Autonomous vehicles commonly include light detection and ranging (LiDAR) scanners in their sensor suites. LiDARs are usually mounted on the vehicle’s roof to generate a point cloud of the surrounding surfaces. In winter driving, falling snow casts randomly distributed shadow regions in the path of the LiDAR laser beams, obscuring the scene from the sensor to a degree proportional to the snowfall rate and snowflake size. In this paper, a post-processing model is developed to simulate the effect of synthesized snowfall on surrounding-vehicle detection accuracy. This additive noise filter synthesizes the effect of falling snow in LiDAR data based on the laser ray path and is applied to data from the popular clear-weather KITTI road driving dataset. Object detection accuracy was quantified using metrics developed to study this effect: the mean and standard deviation of the detection bounding-box centroid error, the percentage error in detection bounding-box volumes, and the percentage of original scene points hidden in the shadow of synthetic noise points. These are metrics an autonomous car needs in order to interact safely with other detected vehicles on the road. Object correlation between clean and noisy frames was used to ensure the accuracy of the metrics, as the noise introduced into the point cloud alters the number of detections. The simulation results show the effect of the synthesized noise on the number and location of detections in each frame: in some cases detections are lost, while in others false positive detections are introduced into the scene. Testing at various noise levels also shows that the detection centroid mean error and the bounding-box volume percentage error grow with increasing noise.
As the noise level increases, the point cloud in a frame grows with reflections from the synthesized snowflakes; however, the percentage of original LiDAR points shadowed by the snowfall remains almost constant, at a mean value of 0.24%.
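The shadowing metric above can be sketched as a ray-occlusion test: an original scene point counts as hidden when a synthetic snow point lies (nearly) on the same laser ray at a shorter range. The angular tolerance standing in for beam divergence and the toy point sets below are assumptions for illustration, not values from the paper:

```python
# Sketch: percentage of original scene points shadowed by synthetic
# snow points along the laser ray path. The angular tolerance and the
# toy point sets are illustrative assumptions, not the paper's values.
import numpy as np

def shadowed_percentage(scene, snow, ang_tol_deg=0.2):
    """scene: (N,3) and snow: (M,3) XYZ points, sensor at the origin.
    A scene point is shadowed if some snow point lies within
    ang_tol_deg of its ray direction and at a shorter range."""
    scene_r = np.linalg.norm(scene, axis=1)
    snow_r = np.linalg.norm(snow, axis=1)
    scene_u = scene / scene_r[:, None]   # unit ray directions
    snow_u = snow / snow_r[:, None]
    cos_tol = np.cos(np.deg2rad(ang_tol_deg))
    close = scene_u @ snow_u.T >= cos_tol          # (N, M) angle test
    nearer = snow_r[None, :] < scene_r[:, None]    # (N, M) range test
    hidden = np.any(close & nearer, axis=1)
    return 100.0 * hidden.mean()

# Toy example: one snowflake directly in front of one of three points.
scene = np.array([[10.0, 0, 0], [0, 10.0, 0], [0, 0, 10.0]])
snow = np.array([[2.0, 0, 0]])   # blocks the first ray only
print(shadowed_percentage(scene, snow))  # one of three points hidden
```

Because a snowflake occludes only the narrow cone behind it, the fraction of shadowed scene points stays small even as many flakes are added, consistent with the near-constant 0.24% reported above.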