Results 1 - 3 of 3
1.
Appl Opt ; 62(10): 2470-2478, 2023 Apr 01.
Article in English | MEDLINE | ID: mdl-37132794

ABSTRACT

This paper studies the image motion introduced by the staring action itself when optical remote sensing satellites perform staring imaging with area-array detectors. The image motion is decomposed into three components: angle-rotation image motion caused by the changing observation angle, size-scaling image motion caused by the changing observation distance, and Earth-rotation image motion caused by the ground target rotating with the Earth. The angle-rotation and size-scaling components are derived theoretically, and the Earth-rotation component is analyzed numerically. A comparison of the characteristics of the three components shows that, for typical staring imaging scenes, angle-rotation image motion dominates, followed by size-scaling image motion, while Earth-rotation image motion is negligible. Under the condition that the image motion not exceed 1 pixel, the maximum allowed exposure time for area-array staring imaging is analyzed. Large-array satellites are found to be ill-suited to long-exposure imaging, because their allowed exposure time drops rapidly as the roll angle increases. Taking a satellite with a 12k×12k area-array detector in a 500 km orbit as an example, the allowed exposure time is 0.88 s at a roll angle of 0° but falls to 0.02 s as the roll angle increases to 28°.
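The 1-pixel exposure limit described in this abstract can be illustrated with a minimal geometric sketch. Assuming angle-rotation image motion alone, a pixel at radius r from the array center moves at speed r·ω for an image-plane rotation rate ω, so the corner pixel of an N×N array sets the limit. The function below and the rotation rate in the example are illustrative assumptions, not values or formulas taken from the paper.

```python
import math

def max_exposure_time(array_size_px, rotation_rate_rad_s, max_motion_px=1.0):
    """Allowed exposure time (s) so that angle-rotation image motion at the
    fastest-moving pixel stays within max_motion_px.

    Toy model: the image rotates rigidly about the array center at
    rotation_rate_rad_s, so the fastest pixel is a corner at radius
    (array_size_px / sqrt(2)) pixels from the center.
    """
    corner_radius_px = array_size_px / math.sqrt(2)
    motion_rate_px_s = corner_radius_px * rotation_rate_rad_s
    return max_motion_px / motion_rate_px_s

# Illustrative only: a 12k x 12k array with an assumed image-plane
# rotation rate of 1e-4 rad/s gives roughly a 1.2 s limit.
t_12k = max_exposure_time(12000, 1e-4)
```

The sketch also reproduces the qualitative trend reported above: doubling the array size halves the allowed exposure time for the same rotation rate, which is why large-array detectors are penalized.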

2.
Sensors (Basel) ; 22(5)2022 Mar 03.
Article in English | MEDLINE | ID: mdl-35271142

ABSTRACT

Monitoring surface quality during machining has considerable practical significance for the performance of high-value products, particularly for their assembly interfaces, and surface roughness is the most important metric of surface quality. Current research on online surface roughness prediction has several limitations: the effect of tool wear variation on surface roughness is seldom considered, the deterioration trends of surface roughness and tool wear differ under variable cutting parameters, and prediction models trained under one set of cutting parameters fail when those parameters change. Accordingly, to monitor the surface quality of the assembly interfaces of high-value products in a timely manner, this paper proposes a surface roughness prediction method that accounts for tool wear variation under variable cutting parameters. A stacked autoencoder combined with a long short-term memory network (SAE-LSTM) serves as the fundamental prediction model, taking tool wear conditions and sensor signals as inputs. A transfer learning strategy is applied to the SAE-LSTM so that online surface roughness prediction remains feasible when cutting parameters change. Machining experiments on the assembly interface (made of Ti6Al4V) of an aircraft vertical tail are conducted, and the monitoring data are used to validate the proposed method; ablation studies evaluate its key modules. The experimental results show that the proposed method outperforms comparison models and tracks the true surface roughness over time. Specifically, the minimum root mean square error and mean absolute percentage error of the predictions after transfer learning are 0.027 µm and 1.56%, respectively.
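The transfer-learning idea in this abstract — reuse a pretrained feature extractor, refit only the output layer when cutting parameters change — can be sketched in a few lines. This is not the paper's SAE-LSTM; the "encoder" below is a frozen random nonlinear projection standing in for any pretrained backbone, and the data are synthetic, with the regime change modeled as a simple output shift.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "encoder" (stand-in for a pretrained backbone such as an SAE-LSTM):
# a fixed nonlinear projection of the raw input window.
def encode(X, W):
    return np.tanh(X @ W)

n_raw, n_feat = 8, 16
W_enc = rng.normal(size=(n_raw, n_feat))        # frozen after "pretraining"

# Source regime: fit a linear head on source data by least squares.
X_src = rng.normal(size=(200, n_raw))
true_w = rng.normal(size=n_feat)
y_src = encode(X_src, W_enc) @ true_w
head_src, *_ = np.linalg.lstsq(encode(X_src, W_enc), y_src, rcond=None)

# Target regime: cutting parameters changed -> outputs shift systematically.
X_tgt = rng.normal(size=(50, n_raw))
y_tgt = encode(X_tgt, W_enc) @ true_w + 0.5

# Transfer step: keep the encoder frozen, refit only the head (with bias).
H_tgt = np.column_stack([encode(X_tgt, W_enc), np.ones(len(X_tgt))])
head_tgt, *_ = np.linalg.lstsq(H_tgt, y_tgt, rcond=None)

err_old = np.sqrt(np.mean((H_tgt[:, :-1] @ head_src - y_tgt) ** 2))  # stale head
err_new = np.sqrt(np.mean((H_tgt @ head_tgt - y_tgt) ** 2))          # fine-tuned
```

With the encoder frozen, only a small head must be refit on the new regime's data, which is the practical appeal of transfer learning when labeled data under new cutting parameters are scarce.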

3.
Sensors (Basel) ; 17(5)2017 May 05.
Article in English | MEDLINE | ID: mdl-28475145

ABSTRACT

In mobile augmented/virtual reality (AR/VR), real-time six-degree-of-freedom (6-DoF) motion tracking is essential for registering virtual scenes with the real world. However, given the limited computational capacity of today's mobile terminals, the latency between consecutive pose updates degrades the user experience. This paper therefore proposes a visual-inertial real-time motion tracking method for mobile AR/VR. The high-frequency, passive outputs of the inertial sensor provide real-time pose updates. To alleviate jitter during visual-inertial fusion, an adaptive filter framework is established that copes with different motion situations automatically, enabling real-time 6-DoF tracking by balancing jitter against latency. The robustness of traditional visual-only motion tracking is also enhanced, yielding better mobile AR/VR performance when motion blur occurs. Experiments demonstrate that the proposed method provides smooth and robust 6-DoF motion tracking for mobile AR/VR in real time.
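The jitter-versus-latency trade-off this abstract describes can be illustrated with a one-dimensional adaptive complementary filter. This is a toy sketch, not the paper's filter framework: the gain schedule, parameter names, and thresholds below are all assumptions chosen only to show the adaptation idea.

```python
import numpy as np

def adaptive_fuse(p_inertial, p_visual, motion_rate,
                  alpha_min=0.05, alpha_max=0.5, rate_scale=10.0):
    """One update of an adaptive complementary filter (1-D toy version).

    p_inertial  : pose predicted by integrating high-rate IMU outputs
    p_visual    : latest (slower, possibly delayed) visual pose estimate
    motion_rate : current motion magnitude, e.g. |angular velocity| (rad/s)

    The blend gain alpha rises with motion_rate: when nearly static, a
    small alpha leans on the smooth inertial prediction (suppressing
    jitter); during fast motion, a large alpha leans on the fresh visual
    fix (bounding drift and perceived latency).
    """
    alpha = float(np.clip(motion_rate / rate_scale, alpha_min, alpha_max))
    return p_inertial + alpha * (p_visual - p_inertial), alpha

# Static device: alpha stays at its floor, output barely moves toward
# the (jittery) visual measurement.
pose_static, a_static = adaptive_fuse(1.0, 1.2, motion_rate=0.0)   # alpha 0.05
# Fast motion: alpha saturates, output snaps toward the visual fix.
pose_fast, a_fast = adaptive_fuse(1.0, 1.2, motion_rate=50.0)      # alpha 0.5
```

In a full 6-DoF system the same gain schedule would be applied per axis to a pose parameterization (e.g. translation plus a quaternion), but the balancing principle is the one shown here.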
