
Optical Flow Inversion for Remote Sensing Image Dense Registration and Sensor's Attitude Motion High-Accurate Measurement

DOI: 10.1155/2014/432613


Abstract:

It has been discovered that image motions and optical flows usually become much more nonlinear and anisotropic in space-borne cameras with a large field of view, especially when perturbations or jitter exist. The phenomenon arises because attitude motion strongly affects the imaging of the three-dimensional planet. In this paper, exploiting these characteristics, an optical flow inversion method is proposed for high-accuracy measurement of remote sensor attitude motion. The principle of the new method is that angular velocities can be measured precisely by reconstructing certain nonuniform optical flows. The first step of the method is to determine the relative displacements and deformations between the overlapped images captured by different detectors; a novel dense subpixel image registration approach is developed for this purpose. On that basis, the optical flow can be reconstructed and high-accuracy attitude measurements are obtained. In the experiment, a remote sensor and its original photographs are investigated, and the results validate that the method is highly reliable and accurate over a broad frequency band.

1. Introduction

For remote sensors in dynamic imaging, one important technology is image motion compensation. Determining image motion velocity precisely is, in practice, a very hard problem. In [1, 2], optical correlators are used to measure image motion in real time from a sequence of mildly smeared, low-exposure images. This technique is appropriate when the whole image velocity field is uniform. Other blind motion estimation algorithms [3–5] have been applied to image postprocessing; they can roughly detect inhomogeneous image motion but lack real-time performance because of their complexity.
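The paper's own dense registration algorithm is not reproduced here, but the idea of recovering a subpixel shift between two overlapped detector images can be sketched with classical phase correlation (the technique extended to subpixel accuracy in [33]). This is a minimal, hedged illustration, not the authors' method; the parabolic peak refinement is one common choice among several.

```python
import numpy as np

def phase_correlation_shift(ref, img):
    """Estimate the (row, col) translation taking `ref` to `img` via
    phase correlation, refined to subpixel accuracy by a parabolic
    fit around the correlation peak."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(img)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.real(np.fft.ifft2(cross))     # near-delta at the shift
    r, c = np.unravel_index(np.argmax(corr), corr.shape)
    R, C = corr.shape

    def parabolic_offset(c_minus, c_0, c_plus):
        # Vertex of the parabola through three neighboring samples.
        denom = c_minus - 2.0 * c_0 + c_plus
        return 0.0 if denom == 0 else 0.5 * (c_minus - c_plus) / denom

    dr = parabolic_offset(corr[(r - 1) % R, c], corr[r, c], corr[(r + 1) % R, c])
    dc = parabolic_offset(corr[r, (c - 1) % C], corr[r, c], corr[r, (c + 1) % C])
    row, col = r + dr, c + dc
    # Shifts beyond half the image size wrap around to negative values.
    if row > R / 2:
        row -= R
    if col > C / 2:
        col -= C
    return row, col
```

A dense registration, as described in the abstract, would apply such an estimator over a grid of small windows rather than the whole frame, yielding a displacement field from which the optical flow can be reconstructed.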
As for space imaging, to avoid motion blurring, the image motion velocity must be computed in real time from the current physical information about the spacecraft's orbit and attitude motion, which can be obtained from space-borne sensors such as star trackers, gyroscopes, and GPS. Wang et al. developed a computational model for image motion vectors and presented an error budget analysis in [6]. They focused on small field of view (FOV) space cameras used in push-broom imaging with small attitude angles; in that situation, the nonlinearity of the image motion velocity field is not significant. However, for cameras with larger FOV, image motion velocity fields are definitely nonlinear and anisotropic because the geometry of the planet will
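To make concrete why attitude motion enters the image motion velocity, consider a first-order, small-angle sketch for a nadir-pointing push-broom camera. This toy model (all names and the formula are illustrative assumptions, not the full nonlinear model that the paper develops) shows that an attitude rate contributes a term proportional to the focal length, which is how jitter perturbs the optical flow field:

```python
def image_motion_velocity(focal_len_m, altitude_m, ground_speed_mps,
                          pitch_rate_radps=0.0):
    """First-order image motion speed (m/s) at the focal plane of a
    nadir-pointing push-broom camera, small-angle approximation.

    orbital_term:  ground track projected through the optics,
                   v = f * V_ground / H.
    attitude_term: a pitch rate omega sweeps the line of sight, adding
                   f * omega directly to the focal-plane velocity.
    """
    orbital_term = focal_len_m * ground_speed_mps / altitude_m
    attitude_term = focal_len_m * pitch_rate_radps
    return orbital_term + attitude_term
```

For example, with a 0.5 m focal length at 500 km altitude and 7 km/s ground speed, the orbital term is 7 mm/s; a pitch rate of only 1 mrad/s adds another 0.5 mm/s, a perturbation that inverting the measured optical flow can recover. In the large-FOV case treated by the paper, both terms additionally vary across the field, producing the nonuniform flows the method exploits.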

References

[1]  V. Tchernykh, M. Beck, and K. Janschek, “An embedded optical flow processor for visual navigation using optical correlator technology,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06), pp. 67–72, Beijing, China, October 2006.
[2]  K. Janschek and V. Tchernykh, “Optical correlator for image motion compensation in the focal plane of a satellite camera,” in Proceedings of the 15th IFAC Symposium on Automatic Control in Aerospace, Bologna, Italy, 2001.
[3]  W. Priedhorsky and J. J. Bloch, “Optical detection of rapidly moving objects in space,” Applied Optics, vol. 44, no. 3, pp. 423–433, 2005.
[4]  T. Brox and J. Malik, “Large displacement optical flow: descriptor matching in variational motion estimation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 3, pp. 500–513, 2011.
[5]  B. Feng, P. P. Bruyant, P. H. Pretorius et al., “Estimation of the rigid-body motion from three-dimensional images using a generalized center-of-mass points approach,” IEEE Transactions on Nuclear Science, vol. 53, no. 5, pp. 2712–2718, 2006.
[6]  J. Wang, P. Yu, C. Yan, J. Ren, and B. He, “Space optical remote sensor image motion velocity vector computational modeling, error budget and synthesis,” Chinese Optics Letters, vol. 3, no. 7, pp. 414–417, 2005.
[7]  A. S. McEwen, M. E. Banks, N. Baugh et al., “The high resolution imaging science experiment (HiRISE) during MRO’s primary science phase (PSP),” Icarus, vol. 205, no. 1, pp. 2–37, 2010.
[8]  F. Ayoub, S. Leprince, R. Binet, K. W. Lewis, O. Aharonson, and J.-P. Avouac, “Influence of camera distortions on satellite image registration and change detection applications,” in Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS '08), pp. II1072–II1075, Boston, Mass, USA, 2008.
[9]  S. Leprince, S. Barbot, F. Ayoub, and J.-P. Avouac, “Automatic and precise orthorectification, coregistration, and subpixel correlation of satellite images, application to ground deformation measurements,” IEEE Transactions on Geoscience and Remote Sensing, vol. 45, no. 6, pp. 1529–1558, 2007.
[10]  S. Leprince, P. Musé, and J.-P. Avouac, “In-flight CCD distortion calibration for pushbroom satellites based on subpixel correlation,” IEEE Transactions on Geoscience and Remote Sensing, vol. 46, no. 9, pp. 2675–2683, 2008.
[11]  Y. Yitzhaky, R. Milberg, S. Yohaev, and N. S. Kopeika, “Comparison of direct blind deconvolution methods for motion-blurred images,” Applied Optics, vol. 38, no. 20, pp. 4325–4332, 1999.
[12]  R. C. Hardie, K. J. Barnard, and R. Ordonez, “Fast super-resolution with affine motion using an adaptive wiener filter and its application to airborne imaging,” Optics Express, vol. 19, no. 27, pp. 26208–26231, 2011.
[13]  E. M. Blixt, J. Semeter, and N. Ivchenko, “Optical flow analysis of the aurora borealis,” IEEE Geoscience and Remote Sensing Letters, vol. 3, no. 1, pp. 159–163, 2006.
[14]  M. G. Mozerov, “Constrained optical flow estimation as a matching problem,” IEEE Transactions on Image Processing, vol. 22, no. 5, pp. 2044–2055, 2013.
[15]  H. Sakaino, “A semitransparency-based optical-flow method with a point trajectory model for particle-like video,” IEEE Transactions on Image Processing, vol. 21, no. 2, pp. 441–450, 2012.
[16]  D. Korsch, “Closed form solution for three-mirror telescopes, corrected for spherical aberration, coma, astigmatism, and field curvature,” Applied Optics, vol. 11, no. 12, pp. 2986–2987, 1972.
[17]  G. Naletto, V. da Deppo, M. G. Pelizzo, R. Ragazzoni, and E. Marchetti, “Optical design of the wide angle camera for the Rosetta mission,” Applied Optics, vol. 41, no. 7, pp. 1446–1453, 2002.
[18]  M. Born, E. Wolf, A. B. Bhatia, and P. C. Clemmow, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light, 7th edition, 1999.
[19]  H. Schaub and J. L. Junkins, Analytical Mechanics of Space Systems, AIAA Education Series, 2002.
[20]  C. Wang, F. Xing, J. H. Wang, and Z. You, “Optical flows method for lightweight agile remote sensor design and instrumentation,” in International Symposium on Photoelectronic Detection and Imaging, vol. 8908 of Proceeding of the SPIE, 2013.
[21]  T. Sun, F. Xing, and Z. You, “Optical system error analysis and calibration method of high-accuracy star trackers,” Sensors, vol. 13, no. 4, pp. 4598–4623, 2013.
[22]  T. Sun, F. Xing, Z. You, and M. Wei, “Motion-blurred star acquisition method of the star tracker under high dynamic conditions,” Optics Express, vol. 21, no. 17, pp. 20096–20110, 2013.
[23]  L. Younes, “Combining geodesic interpolating splines and affine transformations,” IEEE Transactions on Image Processing, vol. 15, no. 5, pp. 1111–1119, 2006.
[24]  B. Zitová and J. Flusser, “Image registration methods: a survey,” Image and Vision Computing, vol. 21, no. 11, pp. 977–1000, 2003.
[25]  Z. L. Song, S. Li, and T. F. George, “Remote sensing image registration approach based on a retrofitted SIFT algorithm and Lissajous-curve trajectories,” Optics Express, vol. 18, no. 2, pp. 513–522, 2010.
[26]  V. Arévalo and J. González, “Improving piecewise linear registration of high-resolution satellite images through mesh optimization,” IEEE Transactions on Geoscience and Remote Sensing, vol. 46, no. 11, pp. 3792–3803, 2008.
[27]  Z. Levi and C. Gotsman, “D-snake: image registration by as-similar-as-possible template deformation,” IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 2, pp. 331–343, 2013.
[28]  R. J. Althof, M. G. J. Wind, and J. T. Dobbins III, “A rapid and automatic image registration algorithm with subpixel accuracy,” IEEE Transactions on Medical Imaging, vol. 16, no. 3, pp. 308–316, 1997.
[29]  W. Tong, “Subpixel image registration with reduced bias,” Optics Letters, vol. 36, no. 5, pp. 763–765, 2011.
[30]  Y. Bentoutou, N. Taleb, K. Kpalma, and J. Ronsin, “An automatic image registration for applications in remote sensing,” IEEE Transactions on Geoscience and Remote Sensing, vol. 43, no. 9, pp. 2127–2137, 2005.
[31]  L. S. Ming, L. Yan, and L. Jindong, “Mapping satellite-1 transmission type photogrammetric and remote sensing,” Journal of Remote Sensing, vol. 16, supplement, pp. 10–16, 2012 (Chinese).
[32]  J. P. Lewis, “Fast template matching,” Vision Interface, vol. 95, pp. 120–123, 1995.
[33]  H. Foroosh, J. B. Zerubia, and M. Berthod, “Extension of phase correlation to subpixel registration,” IEEE Transactions on Image Processing, vol. 11, no. 3, pp. 188–200, 2002.
