Results 1 - 3 of 3
1.
Opt Express ; 28(2): 2263-2275, 2020 Jan 20.
Article in English | MEDLINE | ID: mdl-32121920

ABSTRACT

Digital projectors are increasingly used in commercial and scientific applications, but they are prone to out-of-focus blurring because their depth of field is typically limited. In this paper, we explore the feasibility of using a deep learning-based approach to analyze the spatially-varying and depth-dependent defocus properties of digital projectors. A multimodal displaying/imaging system is built to capture images projected at various depths. Based on the constructed dataset of well-aligned in-focus, out-of-focus, and depth images, we propose a novel multi-channel residual deep network model that learns the end-to-end mapping between in-focus and out-of-focus image patches captured at different spatial locations and depths. To the best of our knowledge, this is the first work to show that these complex spatially-varying and depth-dependent blurring effects can be accurately learned from real-captured image pairs rather than hand-crafted as in previous approaches. Experimental results demonstrate that the proposed deep learning-based method significantly outperforms state-of-the-art defocus kernel estimation techniques and thus yields better out-of-focus compensation for extending the dynamic ranges of digital projectors.
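
As a rough illustration of the kind of model described in this abstract, the sketch below maps an in-focus patch, its depth, and its normalized spatial location to a predicted out-of-focus patch with a multi-channel residual CNN. The channel counts, block count, and L1 loss are assumptions for illustration, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)

class DefocusNet(nn.Module):
    """Predicts the defocused appearance of a projected patch (illustrative architecture)."""
    def __init__(self, num_blocks: int = 8, width: int = 64):
        super().__init__()
        # 6 input channels: RGB in-focus patch, depth, and normalized x/y coordinate maps
        self.head = nn.Conv2d(6, width, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(num_blocks)])
        self.tail = nn.Conv2d(width, 3, kernel_size=3, padding=1)

    def forward(self, in_focus, depth, xy):
        x = torch.cat([in_focus, depth, xy], dim=1)
        feats = self.blocks(self.head(x))
        # global residual: the network only has to learn the blur-induced change
        return in_focus + self.tail(feats)

# toy forward/backward pass on a single 64x64 patch
net = DefocusNet()
in_focus = torch.rand(1, 3, 64, 64)   # in-focus projector patch
depth = torch.rand(1, 1, 64, 64)      # normalized projection depth
xy = torch.rand(1, 2, 64, 64)         # normalized patch location
captured = torch.rand(1, 3, 64, 64)   # placeholder for a real-captured out-of-focus patch
loss = nn.functional.l1_loss(net(in_focus, depth, xy), captured)
loss.backward()
```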

2.
Appl Opt ; 58(12): 3238-3246, 2019 Apr 20.
Article in English | MEDLINE | ID: mdl-31044801

ABSTRACT

The fusion of three-dimensional (3D) geometrical and two-dimensional (2D) thermal information provides a promising way to characterize the temperature distribution of 3D objects, extending infrared imaging from 2D to 3D to support a variety of thermal inspection applications. In this paper, we present an effective on-the-fly calibration approach for accurately aligning depth and thermal data to facilitate dynamic, high-speed 3D thermal scanning tasks. For each pair of depth and thermal frames, we estimate their relative pose by minimizing an objective function that measures the temperature consistency between the 2D infrared image and the reference 3D thermographic model. The proposed frame-to-model mapping scheme can be seamlessly integrated into a generic 3D thermographic reconstruction framework. Through graphics-processing-unit-based acceleration, our method requires less than 10 ms to generate a pair of well-aligned depth and thermal images without hardware synchronization, and it improves the robustness of the system against significant camera motion.
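
The sketch below illustrates the frame-to-model idea in this abstract: solve for the 6-DoF relative pose that makes the temperatures stored on a reference 3D thermographic model agree with a new infrared frame. The pinhole intrinsics, rotation-vector parameterization, bilinear sampler, and CPU least-squares solver are illustrative assumptions; the paper's implementation is GPU-accelerated.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def bilinear_sample(img, u, v):
    """Bilinearly interpolate img at continuous pixel coordinates (u, v)."""
    u = np.clip(u, 0, img.shape[1] - 1.001)
    v = np.clip(v, 0, img.shape[0] - 1.001)
    u0, v0 = np.floor(u).astype(int), np.floor(v).astype(int)
    du, dv = u - u0, v - v0
    return ((1 - du) * (1 - dv) * img[v0, u0] + du * (1 - dv) * img[v0, u0 + 1]
            + (1 - du) * dv * img[v0 + 1, u0] + du * dv * img[v0 + 1, u0 + 1])

def thermal_consistency(pose, model_pts, model_temps, ir_image, K):
    """Residuals: model-vertex temperatures vs. the IR pixels they project onto."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    cam_pts = model_pts @ R.T + pose[3:]          # model frame -> thermal camera frame
    uv = cam_pts @ K.T
    sampled = bilinear_sample(ir_image, uv[:, 0] / uv[:, 2], uv[:, 1] / uv[:, 2])
    return sampled - model_temps                  # temperature-consistency term

# synthetic data: 500 model vertices ~2 m away, per-vertex temperatures, a fake IR frame
rng = np.random.default_rng(0)
K = np.array([[400.0, 0, 160], [0, 400.0, 120], [0, 0, 1]])   # assumed IR intrinsics
model_pts = rng.random((500, 3)) + [0.0, 0.0, 2.0]
model_temps = 20 + 5 * rng.random(500)                        # degrees Celsius
ir_image = 20 + 5 * rng.random((240, 320))

pose0 = np.zeros(6)                               # [rotation vector | translation]
result = least_squares(thermal_consistency, pose0, loss="huber",
                       args=(model_pts, model_temps, ir_image, K))
print("estimated relative pose:", result.x)
```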

3.
Opt Express ; 26(7): 8179-8193, 2018 Apr 02.
Article in English | MEDLINE | ID: mdl-29715787

ABSTRACT

Three-dimensional geometrical models with incorporated surface temperature data provide important information for applications such as medical imaging, energy auditing, and intelligent robotics. In this paper, we present a robust method for mobile, real-time 3D thermographic reconstruction through depth and thermal sensor fusion. A multimodal imaging device consisting of a thermal camera and an RGB-D sensor is geometrically calibrated and used for data capture. Based on the underlying principle that temperature information remains robust against illumination and viewpoint changes, we present a Thermal-guided Iterative Closest Point (T-ICP) methodology to enable reliable 3D thermal scanning applications. The pose of the sensing device is initially estimated using correspondences found by maximizing the thermal consistency between consecutive infrared images. This coarse pose estimate is then refined by finding the motion parameters that minimize a combined geometric and thermographic loss function. Experimental results demonstrate that the complementary information captured by the multimodal sensors can be utilized to improve the performance of 3D thermographic reconstruction. Through effective fusion of thermal and depth data, the proposed approach generates more accurate 3D thermal models using significantly less scanning data.
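
A minimal sketch of the refinement step described in this abstract: a combined objective with a geometric point-to-plane term and a thermographic term that projects transformed source points into the target infrared frame. The weighting, nearest-pixel lookup, and synthetic correspondences are assumptions for illustration, not the paper's T-ICP implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def t_icp_residuals(pose, src_pts, src_temps, dst_pts, dst_normals, ir_image, K, w=0.1):
    """Combined geometric and thermographic residuals for a fixed set of correspondences."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    moved = src_pts @ R.T + pose[3:]
    # geometric term: point-to-plane distance to the matched target points
    geo = np.einsum("ij,ij->i", moved - dst_pts, dst_normals)
    # thermographic term: compare stored source temperatures with the target IR frame;
    # nearest-pixel lookup keeps the sketch short (a real implementation would interpolate)
    uv = moved @ K.T
    u = np.clip((uv[:, 0] / uv[:, 2]).round().astype(int), 0, ir_image.shape[1] - 1)
    v = np.clip((uv[:, 1] / uv[:, 2]).round().astype(int), 0, ir_image.shape[0] - 1)
    thermal = ir_image[v, u] - src_temps
    return np.concatenate([geo, w * thermal])

# toy correspondences roughly 2 m in front of the camera, plus a synthetic IR frame
rng = np.random.default_rng(0)
K = np.array([[400.0, 0, 160], [0, 400.0, 120], [0, 0, 1]])   # assumed IR intrinsics
dst_pts = rng.random((300, 3)) + [0.0, 0.0, 2.0]
dst_normals = rng.normal(size=(300, 3))
dst_normals /= np.linalg.norm(dst_normals, axis=1, keepdims=True)
src_pts = dst_pts + 0.01 * rng.normal(size=(300, 3))          # slightly perturbed copy
src_temps = 25 + rng.random(300)
ir_image = 25 + rng.random((240, 320))

pose = least_squares(t_icp_residuals, np.zeros(6),
                     args=(src_pts, src_temps, dst_pts, dst_normals, ir_image, K)).x
print("refined pose:", pose)
```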
