1.
Opt Lett ; 49(10): 2593-2596, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38748113

ABSTRACT

We propose a novel, to our knowledge, approach to addressing the limitations of traditional pancake lenses for virtual reality headsets, such as low image contrast and poor performance when the eyes rotate. The design leverages the foveated nature of human vision, achieving a superior modulation transfer function in the foveal area to significantly enhance optical performance. Furthermore, the pancake lens design accounts for the rotation of the user's pupil position, maintaining optimal image quality even when the user's eye rotates. We present the parameters and optimization of this lens, which exploits the characteristics of the human visual system and the rotated pupil position, leading to improvements in image quality and user experience. Lens design and image simulation results demonstrate the effectiveness of the approach.
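
As a rough illustration of how a foveated weighting could enter a lens-optimization merit function, the sketch below weights MTF shortfalls more heavily near the fovea. The weighting function, field-angle grid, and MTF values are illustrative assumptions, not the authors' actual optimizer or data.

```python
import numpy as np

def foveal_weight(field_deg, foveal_half_angle=10.0, floor=0.2):
    """Illustrative weighting: full weight inside the foveal region,
    smoothly decaying toward a small floor in the periphery."""
    w = np.exp(-0.5 * (np.asarray(field_deg) / foveal_half_angle) ** 2)
    return floor + (1.0 - floor) * w

def weighted_mtf_merit(field_deg, mtf_values, mtf_target=0.3):
    """Penalize MTF shortfall more heavily near the fovea.
    `mtf_values` are MTF samples at a chosen spatial frequency for each
    field angle (e.g. from a ray-trace engine)."""
    shortfall = np.clip(mtf_target - np.asarray(mtf_values), 0.0, None)
    return np.sum(foveal_weight(field_deg) * shortfall ** 2)

# Example: a design that is sharp on-axis but soft in the periphery
fields = np.array([0, 5, 10, 20, 30, 40])            # field angles (degrees)
mtf = np.array([0.45, 0.42, 0.35, 0.22, 0.15, 0.10])  # hypothetical MTF values
print(weighted_mtf_merit(fields, mtf))
```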

2.
Opt Express ; 31(24): 39880-39892, 2023 Nov 20.
Article in English | MEDLINE | ID: mdl-38041301

ABSTRACT

Eye trackers play a crucial role in the development of future display systems, such as head-mounted displays and augmented reality glasses. However, ensuring robustness and accuracy in gaze estimation poses challenges, particularly with the limited space available for the transmitter and receiver components within these devices. To address these issues, we propose what we believe is a novel eye tracker design mounted on foldable temples, which not only supports accurate gaze estimation but also provides a slim form factor and unobstructed vision. Our temple-mounted eye tracker utilizes a near-infrared imaging system and incorporates a patterned near-infrared mirror for calibration markers. We present wearable prototypes of the eye tracker and introduce a unique calibration and gaze extraction algorithm that considers the mirror's spatial reflectance distribution. The accuracy of gaze extraction is evaluated through tests involving multiple users in realistic scenarios. We conclude with an evaluation of the results and a comprehensive discussion of the applicability of the temple-mounted eye tracker.


Subject(s)
Augmented Reality , Smart Glasses , Eye Movements , Head , Calibration
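
The paper's calibration relies on the patterned near-infrared mirror and its reflectance distribution; as a generic stand-in, the sketch below shows a simple polynomial gaze calibration that maps detected pupil coordinates to known calibration targets. All names and numbers are hypothetical, not the paper's algorithm.

```python
import numpy as np

def fit_gaze_calibration(pupil_xy, target_xy):
    """Fit a 2nd-order polynomial map from pupil-center image coordinates
    to the gaze targets shown during calibration (least squares)."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs

def estimate_gaze(pupil_xy, coeffs):
    """Apply the fitted map to new pupil-center detections."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return A @ coeffs

# Hypothetical 9-point calibration followed by a test sample
rng = np.random.default_rng(0)
pupil = rng.uniform(100, 300, size=(9, 2))    # detected pupil centers (px)
targets = rng.uniform(-15, 15, size=(9, 2))   # calibration gaze targets (deg)
coeffs = fit_gaze_calibration(pupil, targets)
print(estimate_gaze(pupil[:1], coeffs))

```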
3.
Opt Express ; 31(19): 30248-30266, 2023 Sep 11.
Article in English | MEDLINE | ID: mdl-37710571

ABSTRACT

We present a noise-robust, deep-learning-based aberration analysis method using 2-step phase-shift measurement data. We first propose a realistic aberration pattern generation method to synthesize a sufficient number of real-world-like aberration patterns for training a deep neural network, by exploiting the asymptotic statistical distribution parameters of the real-world Zernike coefficients extracted from a finite number of experimentally measured real-world aberration patterns. As a result, we generate a real-world-like synthetic dataset of 200,000 different aberrations from 15 sets of real-world aberration patterns obtained by a Michelson interferometer under a variety of measurement conditions, using the 4-step derivative fitting method together with Gaussian density estimation. We then train the deep neural network with the real-world-like synthetic dataset, using two types of network architectures, GoogLeNet and ResNet101. By applying the proposed learning-based 2-step aberration analysis method to the analysis of numerically generated aberrations formed under 100 different conditions, we verify that the proposed 2-step method clearly outperforms the existing 4-step iterative methods based on 4-step measurements, including the derivative fitting, transport of intensity equation (TIE), and robust TIE methods, in terms of noise robustness, root mean square error (RMSE), and inference time. By applying the proposed 2-step method to the analysis of real-world aberrations experimentally obtained under a variety of measurement conditions, we also verify that the proposed 2-step method achieves comparable performance in terms of the RMSE between the reconstructed and measured aberration patterns, and exhibits qualitative superiority in reconstructing more realistic fringe patterns and phase distributions compared with the existing 4-step iterative methods. Since the proposed 2-step method can be extended to an even more general analysis of aberrations of any higher order, we expect that it will provide a practical route to comprehensive aberration analysis and that further studies will extend its usefulness and improve its operational performance in terms of algorithm compactness, noise robustness, and computational speed.
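
A minimal sketch of the dataset-synthesis idea, assuming independent Gaussian statistics per Zernike coefficient and using only a few low-order Zernike terms for illustration; the paper's pipeline works with the full coefficient set extracted by 4-step derivative fitting.

```python
import numpy as np

def zernike_phase(coeffs, grid=128):
    """Render a phase map from low-order Zernike coefficients
    (defocus, astigmatism 0/45, coma x/y) over the unit pupil."""
    y, x = np.mgrid[-1:1:grid * 1j, -1:1:grid * 1j]
    r2 = x**2 + y**2
    mask = r2 <= 1.0
    basis = np.stack([
        2 * r2 - 1,          # defocus
        x**2 - y**2,         # astigmatism 0
        2 * x * y,           # astigmatism 45
        (3 * r2 - 2) * x,    # coma x
        (3 * r2 - 2) * y,    # coma y
    ])
    return mask * np.tensordot(coeffs, basis, axes=1)

def synthesize_dataset(measured_coeffs, n_samples, rng=None):
    """Fit an independent Gaussian to each measured Zernike coefficient
    and draw new coefficient vectors from it (Gaussian density estimation),
    then render one synthetic aberration map per sampled vector."""
    rng = rng or np.random.default_rng(0)
    mu, sigma = measured_coeffs.mean(0), measured_coeffs.std(0)
    samples = rng.normal(mu, sigma, size=(n_samples, measured_coeffs.shape[1]))
    return np.stack([zernike_phase(c) for c in samples])

# e.g. 15 measured coefficient vectors (hypothetical values) -> 1,000 synthetic maps
measured = np.random.default_rng(1).normal(0, 0.5, size=(15, 5))
dataset = synthesize_dataset(measured, 1000)
print(dataset.shape)   # (1000, 128, 128)
```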

4.
Biomed Opt Express ; 12(8): 5179-5195, 2021 Aug 01.
Article in English | MEDLINE | ID: mdl-34513250

ABSTRACT

Vision-correcting displays are key to providing physical and physiological comfort to users with refractive errors. Among such displays are holographic displays, which can provide a high-resolution, vision-adaptive solution with complex wavefront modulation. However, none of the existing hologram rendering techniques has considered the optical properties of the human eye or evaluated the significance of vision correction. Here, we introduce a vision-correcting holographic display and hologram acquisition method that integrates user-dependent prescriptions and a physical model of the optics, enabling the correction of on-axis and off-axis aberrations. Experimental and empirical evaluations of the vision-correcting holographic display show the advantages of holographic correction over conventional vision correction solutions.
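
As a heavily simplified sketch of the vision-correction idea, the code below adds a thin-lens (defocus-only) phase matching a spherical prescription to a hologram before wrapping it for an SLM. The paper's method models the full optics, including off-axis aberrations; the function names, parameters, and sign convention here are illustrative assumptions.

```python
import numpy as np

def correction_phase(sphere_diopters, aperture_mm=4.0, wavelength_nm=532.0, grid=512):
    """Thin-lens phase whose power equals the user's spherical prescription
    (as with spectacles, a myope with a negative sphere gets negative power).
    Returns phase in radians over a square aperture."""
    lam = wavelength_nm * 1e-9
    half = aperture_mm * 1e-3 / 2
    y, x = np.mgrid[-half:half:grid * 1j, -half:half:grid * 1j]
    r2 = x**2 + y**2
    return -np.pi * sphere_diopters * r2 / lam

def apply_correction(hologram_phase, sphere_diopters, **kwargs):
    """Add the corrective phase and wrap into (-pi, pi] for a phase-only SLM."""
    corrected = hologram_phase + correction_phase(
        sphere_diopters, grid=hologram_phase.shape[0], **kwargs)
    return np.angle(np.exp(1j * corrected))

# e.g. pre-compensate a -2.0 D myopic user on a 512x512 phase hologram
holo = np.zeros((512, 512))
print(apply_correction(holo, -2.0).shape)
```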

5.
IEEE Trans Vis Comput Graph ; 25(5): 1928-1939, 2019 05.
Article in English | MEDLINE | ID: mdl-30794179

ABSTRACT

Traditional optical manufacturing poses a great challenge to near-eye display designers due to large lead times on the order of multiple weeks, limiting the ability of optical designers to iterate quickly and explore beyond conventional designs. We present a complete near-eye display manufacturing pipeline with a one-day lead time using commodity hardware. Our novel manufacturing pipeline consists of several innovations, including a rapid production technique that improves the surface of a 3D-printed component to an optical quality suitable for near-eye display applications, a computational design methodology using machine learning and ray tracing to create freeform static projection screen surfaces for near-eye displays that can represent arbitrary focal surfaces, and a custom projection lens design that distributes pixels non-uniformly for a foveated near-eye display hardware design candidate. We have demonstrated untethered augmented reality near-eye display prototypes to assess the success of our technique, and show that a ski-goggles form factor, a large monocular field of view (30°×55°), and a resolution of 12 cycles per degree can be achieved.
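
For context, a quick back-of-the-envelope check of the pixel budget implied by the reported field of view and angular resolution, assuming uniform Nyquist sampling (the actual design relaxes this with a non-uniform, foveated pixel distribution):

```python
# Pixel budget implied by the reported 30° x 55° monocular field of view at
# 12 cycles per degree, assuming uniform sampling at the Nyquist rate
# (2 pixels per cycle).
fov_a, fov_b = 55.0, 30.0      # the two field-of-view extents, in degrees
cpd = 12.0                     # cycles per degree
px_a, px_b = 2 * cpd * fov_a, 2 * cpd * fov_b
print(f"uniform-sampling requirement: {px_a:.0f} x {px_b:.0f} pixels")  # 1320 x 720
```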

6.
J Biomed Opt ; 23(6): 1-11, 2018 06.
Article in English | MEDLINE | ID: mdl-29931838

ABSTRACT

Here, we present dual-dimensional microscopy that captures both two-dimensional (2-D) and light-field images of an in-vivo sample simultaneously, synthesizes an upsampled light-field image, and visualizes it with a computational light-field display system, all in real time. Compared with conventional light-field microscopy, the additional 2-D image greatly enhances the lateral resolution at the native object plane up to the diffraction limit and compensates for the image degradation there. The whole process, from capture to display, runs in real time with a parallel computation algorithm, which enables observation of the sample's three-dimensional (3-D) movement and direct interaction with the in-vivo sample. We demonstrate a real-time 3-D interactive experiment with Caenorhabditis elegans.


Subject(s)
Caenorhabditis elegans/cytology , Imaging, Three-Dimensional/instrumentation , Microscopy/methods , Algorithms , Animals , Computer Systems , Fourier Analysis , Motor Activity
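
A toy sketch of the fusion idea, assuming the high-resolution 2-D image is simply blended into the central view of the upsampled light field at the native object plane; the actual synthesis and its parallel implementation are described in the paper, and the array shapes and blend weight below are hypothetical.

```python
import numpy as np

def fuse_native_plane(lightfield, image2d, alpha=0.7):
    """Blend a high-resolution 2-D image into the central (on-axis) view of an
    upsampled light field to restore detail at the native object plane.
    `lightfield` has shape (n_views_y, n_views_x, H, W); `image2d` is (H, W)."""
    lf = lightfield.copy()
    cy, cx = lf.shape[0] // 2, lf.shape[1] // 2
    lf[cy, cx] = alpha * image2d + (1 - alpha) * lf[cy, cx]
    return lf

# Hypothetical 5x5-view light field upsampled to the 2-D camera resolution
lf = np.random.rand(5, 5, 512, 512)
img = np.random.rand(512, 512)
print(fuse_native_plane(lf, img).shape)

```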
7.
Opt Lett ; 41(12): 2751-4, 2016 Jun 15.
Article in English | MEDLINE | ID: mdl-27304280

ABSTRACT

In light field microscopy (LFM), the F-number of the micro lens array (MLA) should be matched with the image-side F-number of the objective lens to fully utilize the resolution of the image sensor. We propose a new F-number matching method that can be applied to multiple objective lenses by using an elastic MLA. We fabricate an elastic MLA from polydimethylsiloxane (PDMS) using a micro-contact printing method and determine the strain required for F-number variation. The strain response is analyzed, and the LFM system with the elastic MLA is demonstrated. Our proposed system can increase the F-number by up to 27.3% and can be applied to multiple objective lenses.
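
A small numerical sketch of the F-number matching condition that motivates the elastic MLA, assuming the common approximation that an objective's image-side F-number is M/(2·NA); the MLA parameters and objective list are hypothetical, and the strain-to-F-number response is what the paper characterizes experimentally.

```python
# F-number matching in light field microscopy: the MLA F-number should equal
# the image-side F-number of the objective so that each micro-image just
# fills its footprint on the sensor (no overlap, no gaps).

def image_side_fnumber(magnification, numerical_aperture):
    """Approximate image-side working F-number of a microscope objective."""
    return magnification / (2.0 * numerical_aperture)

def mla_fnumber(focal_length_um, pitch_um):
    """F-number of a microlens array: focal length divided by lens pitch."""
    return focal_length_um / pitch_um

objectives = {"20x/0.4": (20, 0.4), "40x/0.75": (40, 0.75), "60x/0.9": (60, 0.9)}
f_mla, pitch = 2500.0, 100.0   # hypothetical MLA: f = 2.5 mm, pitch = 100 um
for name, (M, NA) in objectives.items():
    target = image_side_fnumber(M, NA)
    current = mla_fnumber(f_mla, pitch)
    print(f"{name}: target F/{target:.1f}, MLA F/{current:.1f}, "
          f"mismatch {100 * (target - current) / current:+.1f}%")
```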

8.
Opt Lett ; 41(11): 2486-9, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-27244395

ABSTRACT

A holographic display system for realizing three-dimensional optical see-through augmented reality (AR) is proposed. A multi-functional holographic optical element (HOE), which simultaneously performs the optical functions of a mirror and a lens, is adopted in the system. In the proposed method, a mirror that guides the light source into a reflection-type spatial light modulator (SLM) and a lens that functions as Fourier-transforming optics are recorded on a single holographic recording material by utilizing the angular multiplexing technique of a volume hologram. The HOE is transparent and performs its optical functions only under the Bragg-matched condition. Therefore, the real-world scenes that are usually distorted by a Fourier lens or an SLM in a conventional holographic display can be observed without visual disturbance by using the proposed mirror-lens HOE (MLHOE). Furthermore, to achieve an optimized optical recording condition for the MLHOE, the optical characteristics of the holographic material are measured. The proposed holographic AR display system is verified experimentally.

9.
Appl Opt ; 54(35): 10333-41, 2015 Dec 10.
Article in English | MEDLINE | ID: mdl-26836855

ABSTRACT

In this paper, we develop a real-time depth-controllable integral imaging system. With a high-frame-rate camera and a focus-controllable lens, light fields from various depth ranges can be captured. Depending on the image plane of the light field camera, objects in virtual and real space are recorded simultaneously. The captured light field information is converted to elemental images in real time without the pseudoscopic problem. In addition, we derive the characteristics and limitations of the light field camera as a 3D broadcast capture device using precise geometric optics. With further analysis, the implemented system provides more accurate light fields than existing devices, without depth distortion. We adapt an f-number matching method at the capture and display stages to record a more exact light field and to resolve depth distortion, respectively. The algorithm allows users to adjust the pixel mapping structure of the reconstructed 3D image in real time. The proposed method demonstrates the possibility of a handheld real-time 3D broadcasting system that is cheaper and more broadly applicable than previous methods.
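
As a sketch of the kind of light-field-to-elemental-image conversion the abstract refers to, the code below performs a standard 4-D remapping from sub-aperture views to elemental images, with a view-axis flip as one common way to avoid a pseudoscopic reconstruction. Index conventions and array sizes are assumptions, not the paper's exact pixel mapping.

```python
import numpy as np

def lightfield_to_elemental(subaperture_views, flip_for_orthoscopic=True):
    """Convert a captured light field, stored as sub-aperture views of shape
    (n_u, n_v, n_s, n_t), into elemental images for an integral imaging
    display. The core operation is a 4-D axis swap; reversing the view axes
    is one common way to avoid a pseudoscopic (depth-inverted) image."""
    lf = subaperture_views[::-1, ::-1] if flip_for_orthoscopic else subaperture_views
    elemental = np.transpose(lf, (2, 3, 0, 1))   # elemental[s, t, u, v] = lf[u, v, s, t]
    n_s, n_t, n_u, n_v = elemental.shape
    # Tile into the 2-D image actually sent to the display panel
    return elemental.transpose(0, 2, 1, 3).reshape(n_s * n_u, n_t * n_v)

# Hypothetical 9x9 angular samples behind a 100x100 lens array
views = np.random.rand(9, 9, 100, 100)
print(lightfield_to_elemental(views).shape)      # (900, 900)
```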

10.
Opt Express ; 22(11): 13659-70, 2014 Jun 02.
Article in English | MEDLINE | ID: mdl-24921560

ABSTRACT

We propose an optical pseudoscopic-to-orthoscopic conversion method for integral imaging using a lens-array holographic optical element (LAHOE), which solves the pseudoscopic problem. The LAHOE reconstructs an array of diverging spherical waves when a phase-conjugated probe wave illuminates it, whereas an array of converging spherical waves is reconstructed under ordinary reconstruction. For given pseudoscopic elemental images, the array of diverging spherical waves integrates orthoscopic three-dimensional images without distortion. The principle of the proposed method is verified by experiments displaying integral imaging on the LAHOE using computer-generated and optically acquired elemental images.

11.
Opt Express ; 22(9): 10210-20, 2014 May 05.
Article in English | MEDLINE | ID: mdl-24921724

ABSTRACT

We propose a real-time integral imaging system for light field microscopy. To implement a live 3D in-vivo experimental environment for multiple experimentalists, we generate elemental images for an integral imaging system from the light field captured with a light field microscope in real time. We apply the f-number matching method when generating the elemental images to reconstruct an undistorted 3D image. Our implemented system produces real and orthoscopic 3D images of micro-objects at 16 frames per second. We verify the proposed system via experiments using Caenorhabditis elegans.


Subject(s)
Computer Systems , Diagnostic Imaging , Imaging, Three-Dimensional/instrumentation , Microscopy/instrumentation , Animals , Caenorhabditis elegans , Humans , Light
12.
Opt Express ; 21(23): 28758-70, 2013 Nov 18.
Article in English | MEDLINE | ID: mdl-24514388

ABSTRACT

We propose a depth-fused display (DFD) with enhanced viewing characteristics, obtained by hybridizing the depth-fusing technique with another three-dimensional display method such as the multi-view or integral imaging method. With hybridization, the viewing angle and expressible depth range can be extended without changing the volume of the system compared with the conventional DFD method. The proposed method is demonstrated with an experimental system.
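
For background, the classic depth-fused display principle splits a pixel's luminance between a front and a rear plane so that the fused point appears at an intermediate depth, approximately linearly in the luminance ratio. A minimal sketch of that weighting (depth values are illustrative):

```python
import numpy as np

def dfd_luminance_weights(target_depth, front_depth, rear_depth):
    """Depth-fused display weighting: return (front, rear) luminance weights
    that make the fused point appear at `target_depth`, under the usual
    linear approximation of depth versus luminance ratio."""
    w_rear = np.clip((target_depth - front_depth) / (rear_depth - front_depth), 0.0, 1.0)
    return 1.0 - w_rear, w_rear

# A point intended at 0.45 m between planes at 0.40 m and 0.60 m
print(dfd_luminance_weights(0.45, 0.40, 0.60))   # (0.75, 0.25)
```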

13.
J Nanosci Nanotechnol ; 6(11): 3647-51, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17252829

ABSTRACT

The uniformity and reproducibility of photoresist nanopatterns fabricated using near-field scanning optical nanolithography (NSOL) are investigated. The nanopatterns can be used as nanomasks for pattern transfer onto a silicon wafer. In the NSOL process, uniform patterning with high reproducibility is essential for reliable transfer of the mask patterns onto a silicon substrate. Using an aperture-type cantilever nanoprobe operated in contact mode and a positive photoresist, various nanopatterns are produced on a thin photoresist layer coated on the silicon substrate. The size and shape variations of the patterns thus produced are investigated using an atomic force microscope to determine their uniformity and reproducibility. It is demonstrated that the NSOL-produced photoresist nanomasks can be successfully applied to silicon pattern transfer, as shown by fabricating a silicon nanochannel array.


Subject(s)
Nanotechnology/instrumentation , Nanotechnology/methods , Photochemistry/methods , Crystallization , Equipment Design , Lasers , Light , Microscopy, Atomic Force/instrumentation , Microscopy, Atomic Force/methods , Microscopy, Scanning Probe/instrumentation , Microscopy, Scanning Probe/methods , Nanoparticles/chemistry , Reproducibility of Results , Silicon/chemistry , Surface Properties
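
A small sketch of how uniformity and reproducibility of the measured pattern sizes could be quantified (mean, standard deviation, coefficient of variation); the linewidth values below are hypothetical, not the paper's data.

```python
import numpy as np

def pattern_uniformity(linewidths_nm):
    """Summarize uniformity of nanopattern linewidths measured by AFM:
    mean, sample standard deviation, and coefficient of variation."""
    w = np.asarray(linewidths_nm, dtype=float)
    mean, std = w.mean(), w.std(ddof=1)
    return {"mean_nm": mean, "std_nm": std, "cv_percent": 100 * std / mean}

# Hypothetical linewidths (nm) from repeated NSOL exposures
print(pattern_uniformity([182, 176, 189, 180, 185, 178, 183]))
```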