Motion blur estimation based on multitarget matching model
Victor Karnaukhov, Mikhail Mozerov
Abstract
We propose a new method to estimate motion blur parameters based on the autocorrelation function of a blurred image. The blurred image is modeled as a superposition of M shifted copies of the original nonblurred image. Under this model, the convolution of the blurred image with itself decomposes into M² pairwise convolutions, which together contribute to the resultant autocorrelation function and produce a distinguishable line corresponding to the motion blur angle. The proposed method achieves angle-estimation accuracy comparable to state-of-the-art methods while having lower computational complexity than popular accurate methods based on the Radon transform. The model also allows accurate estimation of the motion blur length; our length-estimation results generally outperform those of Radon-transform-based methods.
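To illustrate the general idea, the sketch below estimates a blur angle from the image autocorrelation in Python. It is not the authors' multitarget matching algorithm: it simply computes the autocorrelation via the Wiener-Khinchin theorem and recovers the orientation of the elongated central ridge from weighted second-order moments. All function names, the 0.5 ridge threshold, and the moment-based orientation step are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def estimate_blur_angle(image):
    """Sketch: estimate a motion-blur angle from the image autocorrelation.

    Assumption (not from the paper): the blur direction appears as an
    elongated ridge in the autocorrelation function (ACF), so the angle
    is recovered from the second-order moments of the central ACF peak.
    """
    img = image.astype(np.float64)
    img -= img.mean()                      # remove DC so the central peak is isolated

    # Autocorrelation via the Wiener-Khinchin theorem: ACF = IFFT(|FFT|^2)
    spectrum = np.fft.fft2(img)
    acf = np.fft.ifft2(np.abs(spectrum) ** 2).real
    acf = np.fft.fftshift(acf)             # move zero lag to the array center
    acf /= acf.max()

    # Keep only the strong central ridge of the ACF (threshold is arbitrary)
    ys, xs = np.nonzero(acf > 0.5)
    cy, cx = (np.array(acf.shape) - 1) / 2.0
    dx, dy = xs - cx, ys - cy
    w = acf[ys, xs]

    # Orientation of the ridge from weighted second-order central moments
    mxx = np.sum(w * dx * dx)
    myy = np.sum(w * dy * dy)
    mxy = np.sum(w * dx * dy)
    angle = 0.5 * np.arctan2(2.0 * mxy, mxx - myy)
    return np.degrees(angle)
```

A blur length could then, in principle, be read from the extent of the ridge along the estimated direction, but the paper's multitarget matching model derives both parameters directly from the structure of the M² pairwise convolution terms rather than from a thresholded ridge as in this sketch.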
© 2016 Society of Photo-Optical Instrumentation Engineers (SPIE) 0091-3286/2016/$25.00 © 2016 SPIE
Victor Karnaukhov and Mikhail Mozerov "Motion blur estimation based on multitarget matching model," Optical Engineering 55(10), 100502 (19 October 2016). https://doi.org/10.1117/1.OE.55.10.100502
Published: 19 October 2016
KEYWORDS: Motion estimation, Motion models, Radon transform, Cameras, Convolution, Error analysis, Signal to noise ratio