ABSTRACT
System requirements for many military electro-optic and IR camera systems reflect the need for both wide-field-of-view situational awareness and high-resolution imaging for target identification. In this work we present a new imaging system architecture designed to perform both functions simultaneously, with the AWARE 10 camera as an example at visible wavelengths. We first describe the basic system architecture and user interface, followed by a laboratory characterization of the system's optical performance. We then describe a field experiment in which the camera was used to identify several maritime targets at varying ranges. The experimental results indicate that users of the system can correctly identify ~10 m targets at ranges between 4 and 6 km with 70% accuracy.
ABSTRACT
Pixel count is the ratio of the solid angle within a camera's field of view to the solid angle covered by a single detector element. Because the size of the smallest resolvable pixel is proportional to aperture diameter and the maximum field of view is scale independent, the diffraction-limited pixel count is proportional to aperture area. At present, digital cameras operate near the fundamental limit of 1-10 megapixels for millimetre-scale apertures, but few approach the corresponding limits of 1-100 gigapixels for centimetre-scale apertures. Barriers to high-pixel-count imaging include scale-dependent geometric aberrations, the cost and complexity of gigapixel sensor arrays, and the computational and communications challenge of gigapixel image management. Here we describe the AWARE-2 camera, which uses a 16-mm entrance aperture to capture snapshot, one-gigapixel images at three frames per minute. AWARE-2 uses a parallel array of microcameras to reduce the problems of gigapixel imaging to those of megapixel imaging, which are more tractable. In cameras of conventional design, lens speed and field of view decrease as lens scale increases, but with the experimental system described here we confirm previous theoretical results suggesting that lens speed and field of view can be scale independent in microcamera-based imagers resolving up to 50 gigapixels. Ubiquitous gigapixel cameras may transform the central challenge of photography from the question of where to point the camera to that of how to mine the data.
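The aperture scaling stated above can be illustrated with a back-of-the-envelope calculation. This sketch assumes an angular resolution of roughly wavelength over aperture diameter and a fixed wide field of view; the function name, the 60-degree field, and the omitted order-unity constants are illustrative choices, not values from the paper.

```python
import math

def diffraction_limited_pixel_count(aperture_m, fov_rad, wavelength_m=550e-9):
    """Rough diffraction-limited pixel count: (FOV / angular resolution)^2,
    with angular resolution ~ wavelength / aperture diameter.
    Order-unity geometric factors are omitted; scaling only."""
    angular_res = wavelength_m / aperture_m      # ~ lambda / D
    return (fov_rad / angular_res) ** 2          # proportional to aperture area

fov = math.radians(60)  # assumed ~1 rad wide-field view
print(f"1 mm aperture:  {diffraction_limited_pixel_count(1e-3, fov)/1e6:.1f} Mpix")
print(f"10 mm aperture: {diffraction_limited_pixel_count(1e-2, fov)/1e9:.2f} Gpix")
```

With these assumptions a millimetre-scale aperture lands in the few-megapixel range and a centimetre-scale aperture in the sub-gigapixel to gigapixel range, consistent with the limits quoted above; because the count scales with aperture area, each tenfold increase in diameter buys a hundredfold increase in pixel count.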
ABSTRACT
We present a novel sensor that measures the entire spatial coherence function within an aperture by use of a variable astigmatic lens. This sensor permits digital capture and processing of partially coherent fields. We demonstrate the sensor by sampling and computing the coherent modes of a three-dimensional incoherent source.
ABSTRACT
We present a new type of optical wave-front sensor: the sampling field sensor (SFS). The SFS attempts to solve the problem of real-time optical phase detection. It has a high space-bandwidth product and can be made compact and vibration insensitive. We describe a particular implementation of this sensor and compare it, through numerical simulations, with a more mature technique based on the Shack-Hartmann wave-front sensor. We also present experimental results for SFS phase estimation. Finally, we discuss the advantages and drawbacks of this SFS implementation and suggest alternative implementations.
ABSTRACT
We use cubic-phase plate imaging to demonstrate an order-of-magnitude improvement in the transverse resolution of three-dimensional objects reconstructed by extended depth-of-field tomography. Our algorithm compensates for the range shear of the cubic-phase approach and uses camera rotation to center the reconstructed volume on a target object point.
ABSTRACT
We show that three-dimensional incoherent primary sources can be reconstructed from finite-aperture Fresnel-zone mutual intensity measurements by means of coordinate and Fourier transformation. The spatial bandpass and impulse response for three-dimensional imaging that result from use of this approach are derived. The transverse and longitudinal resolutions are evaluated as functions of aperture size and source distance. The longitudinal resolution of three-dimensional coherence imaging falls inversely with the square of the source distance in both the Fresnel and Fraunhofer zones. We experimentally measure the three-dimensional point-spread function by using a rotational shear interferometer.
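The quadratic distance dependence of the longitudinal resolution can be sketched with a standard aperture argument. This is a hedged illustration, not the paper's derivation: $A$ denotes the aperture size, $z$ the source distance, $\lambda$ the wavelength, and order-unity factors are dropped.

$$\mathrm{NA} \approx \frac{A}{2z}, \qquad \Delta z \sim \frac{\lambda}{\mathrm{NA}^{2}} \propto \frac{\lambda\, z^{2}}{A^{2}},$$

so for a fixed aperture the longitudinal resolution element grows as the square of the source distance, in line with the scaling reported above.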
ABSTRACT
We consider optical interferometric cross correlators based on broadband light sources. We derive the signal-to-noise ratio from basic principles and supply experimental evidence that corroborates the theoretical analysis. Noise sources are discussed, and the signal-to-noise ratio of our experimental system is measured.