Results 1 - 20 of 22
1.
Sci Rep ; 13(1): 8443, 2023 May 25.
Article in English | MEDLINE | ID: mdl-37231018

ABSTRACT

Dempster-Shafer evidence theory is an effective method for information fusion. However, how to handle fusion paradoxes when using Dempster's combination rule is still an open issue. To address this issue, a new basic probability assignment (BPA) generation method based on cosine similarity and belief entropy is proposed in this paper. Firstly, the Mahalanobis distance is used to measure the similarity between the test sample and the BPA of each focal element in the frame of discernment. Then, cosine similarity and belief entropy are used to measure the reliability and the uncertainty of each BPA, respectively, in order to adjust it and generate a standard BPA. Finally, Dempster's combination rule is used to fuse the new BPAs. Numerical examples demonstrate the effectiveness of the proposed method in resolving the classical fusion paradoxes, and classification accuracy on several datasets further verifies its rationality and efficiency.
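For readers less familiar with the fusion step referenced throughout these entries, the following is a minimal sketch of Dempster's combination rule for two BPAs, with focal elements represented as frozensets (an assumption for illustration); it does not reproduce the paper's BPA-generation and adjustment steps (Mahalanobis distance, cosine similarity, belief entropy), which are specific to this work.

```python
# Minimal sketch of Dempster's combination rule for two BPAs, with focal
# elements represented as frozensets over the frame of discernment.

def dempster_combine(m1: dict, m2: dict) -> dict:
    """Combine two basic probability assignments with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass that would go to the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined.")
    # Normalize by 1 - K, where K is the total conflict.
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Small example: two sources over the frame {A, B, C}.
m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.3, frozenset("ABC"): 0.1}
m2 = {frozenset("A"): 0.5, frozenset("B"): 0.3, frozenset("ABC"): 0.2}
print(dempster_combine(m1, m2))
```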

2.
Sci Rep ; 13(1): 7609, 2023 May 10.
Article in English | MEDLINE | ID: mdl-37165012

ABSTRACT

Uncertain information processing is a key problem in classification. Dempster-Shafer evidence theory (D-S evidence theory) is widely used for modelling and fusing uncertain information. For uncertain information fusion, however, Dempster's combination rule has a limitation: in some cases it may produce counterintuitive fusion results. In this paper, a new correlation belief function is proposed to address this problem. The proposed method transfers belief from a proposition to other related propositions, avoiding information loss during fusion and effectively addressing conflict management in D-S evidence theory. Classification experiments on UCI datasets show that the proposed method not only assigns a higher belief to the correct propositions than other methods, but also expresses the conflict among the data clearly. The robustness and superiority of the proposed method in classification are verified through experiments on different datasets with varying proportions of the training set.

3.
Sensors (Basel) ; 23(9)2023 Apr 30.
Article in English | MEDLINE | ID: mdl-37177622

ABSTRACT

Environmental stability technology plays an important role in improving the environmental adaptability and image resolution of an aerial mapping camera and in ensuring the stability of its geometric parameters. Traditional approaches apply active and passive thermal design directly to the optical system, which easily produces radial temperature differences across the optical components and cannot eliminate the influence of pressure changes. To solve this problem, an environmental stability design method based on a multi-dimensional structure is proposed. Firstly, the aerial mapping camera is partitioned into an imaging system component (core) and a sealing cylinder (periphery), and a sealed air-insulation sandwich is formed between the two parts to realize the sealing design. A thermal interface is reserved outside the seal to avoid the radial thermal stress caused by direct heating of the optical parts, forming a multi-dimensional environmental stability structure. Secondly, the core and the external thermal environment of the aerial mapping camera in a complex aviation environment are modeled and analyzed theoretically. Finally, the effectiveness and stability of the multi-dimensional structure method are verified by thermal simulation and flight tests. The results show that the thermal control power is 240 W, the thermal gradient of the optical system is less than 5 °C, and the radial temperature difference is less than 0.5 °C; high-quality images and good ground measurement accuracy are obtained. Compared with traditional thermal control methods, the proposed method offers higher accuracy and lower power consumption, effectively reducing both the power consumption and the difficulty of thermal control.

4.
Entropy (Basel) ; 25(5)2023 May 06.
Article in English | MEDLINE | ID: mdl-37238514

ABSTRACT

Failure mode and effects analysis (FMEA) is a proactive risk management approach, and risk management under uncertainty with the FMEA method has attracted considerable attention. Dempster-Shafer (D-S) evidence theory is a popular approximate reasoning theory for handling uncertain information, and it can be adopted in FMEA for uncertain information processing because of its flexibility and its strength in coping with uncertain and subjective assessments. However, the assessments coming from FMEA experts may contain highly conflicting evidence for information fusion in the framework of D-S evidence theory. Therefore, in this paper, we propose an improved FMEA method based on a Gaussian model and D-S evidence theory to handle the subjective assessments of FMEA experts, and we apply it to FMEA of the air system of an aero turbofan engine. First, we define three kinds of generalized scaling based on Gaussian distribution characteristics to deal with potentially highly conflicting evidence in the assessments. Then, we fuse the expert assessments with the Dempster combination rule. Finally, we obtain the risk priority number to rank the risk level of the FMEA items. The experimental results show that the method is effective and reasonable for risk analysis of the air system of an aero turbofan engine.
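As a small illustration of the final ranking step, the sketch below computes the classical risk priority number RPN = S × O × D for a few hypothetical FMEA items, assuming the fused expert assessments have already been reduced to crisp severity, occurrence and detection scores; the Gaussian scaling and evidence-fusion steps of the paper are not reproduced.

```python
# Ranking FMEA items by risk priority number (RPN), assuming crisp S, O, D
# scores on a 1-10 scale have already been obtained from the fused assessments.

from typing import NamedTuple

class FmeaItem(NamedTuple):
    name: str
    severity: float
    occurrence: float
    detection: float

    @property
    def rpn(self) -> float:
        # Classical definition: RPN = S * O * D.
        return self.severity * self.occurrence * self.detection

items = [
    FmeaItem("seal leakage", 7, 4, 6),     # hypothetical failure modes
    FmeaItem("valve sticking", 8, 3, 5),
    FmeaItem("sensor drift", 5, 6, 4),
]

for item in sorted(items, key=lambda it: it.rpn, reverse=True):
    print(f"{item.name:15s} RPN = {item.rpn:.0f}")
```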

5.
Entropy (Basel) ; 25(3)2023 Mar 06.
Article in English | MEDLINE | ID: mdl-36981350

ABSTRACT

Dempster-Shafer evidence theory is widely used to deal with uncertain information through evidence modeling and evidence reasoning. However, if different pieces of evidence are highly contradictory, the Dempster combination rule may give a counterintuitive fusion result. Many methods have been proposed for fusing conflicting evidence, and it is still an open issue. This paper proposes a new reliability coefficient based on the betting-commitment evidence distance in Dempster-Shafer evidence theory for fusing conflicting and uncertain information. A single belief function for belief assignment on the initial frame of discernment is defined. After preprocessing the evidence with the proposed reliability coefficient and the single belief function, the fusion result is calculated with the Dempster combination rule. To evaluate the effectiveness of the proposed reliability coefficient, a new uncertain-information fusion method based on it is constructed. Experimental results on UCI machine learning data sets show the availability and effectiveness of the new reliability coefficient for uncertain information processing.
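The betting-commitment (pignistic) transformation underlying the proposed evidence distance is standard; the sketch below shows it together with a simple maximum-difference distance between the resulting probability vectors. The paper's exact distance and reliability-coefficient definitions may differ.

```python
# Betting-commitment (pignistic) transformation of a BPA and a simple
# distance between two pignistic probability functions.

def pignistic(m: dict, frame: frozenset) -> dict:
    """BetP(x) = sum over focal elements A containing x of m(A) / |A|."""
    betp = {x: 0.0 for x in frame}
    for a, mass in m.items():
        for x in a:
            betp[x] += mass / len(a)
    return betp

def betting_distance(m1: dict, m2: dict, frame: frozenset) -> float:
    """Maximum difference between the two pignistic probability functions."""
    p1, p2 = pignistic(m1, frame), pignistic(m2, frame)
    return max(abs(p1[x] - p2[x]) for x in frame)

frame = frozenset("ABC")
m1 = {frozenset("A"): 0.7, frozenset("BC"): 0.3}
m2 = {frozenset("B"): 0.6, frozenset("ABC"): 0.4}
print(betting_distance(m1, m2, frame))
```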

6.
Entropy (Basel) ; 24(11)2022 Nov 02.
Article in English | MEDLINE | ID: mdl-36359686

ABSTRACT

Dempster-Shafer evidence theory is widely used for modeling and reasoning about uncertain information in real applications. Recently, a new perspective of modeling uncertain information with the negation of evidence was proposed and has attracted considerable attention. Both the basic probability assignment (BPA) and the negation of the BPA in the evidence theory framework can model and reason about uncertain information. However, how to quantify the uncertainty in the negation information modeled as the negation of a BPA is still an open issue. Inspired by the uncertainty measures in Dempster-Shafer evidence theory, a method for measuring the uncertainty of negation evidence is proposed. The belief entropy named Deng entropy, which has attracted a lot of attention among researchers, is adopted and improved for this purpose. The proposed measure is defined on the negation of the BPA and quantifies the uncertainty of the negation evidence. In addition, an improved multi-source information fusion method that accounts for the uncertainty of the negation evidence through the new measure is proposed. Experimental results on a numerical example and a fault diagnosis problem verify the rationality and effectiveness of the proposed method in measuring and fusing uncertain information.
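Deng entropy, the belief entropy this work adopts and improves, has a well-known closed form, Ed(m) = -Σ_A m(A) log2(m(A)/(2^|A| - 1)); a minimal sketch follows. The negation of the BPA and the improved measure itself are specific to the paper and not reproduced.

```python
# Minimal sketch of Deng entropy for a BPA with frozenset focal elements.

import math

def deng_entropy(m: dict) -> float:
    total = 0.0
    for a, mass in m.items():
        if mass > 0:
            # Each focal element A contributes -m(A) * log2( m(A) / (2^|A| - 1) ).
            total -= mass * math.log2(mass / (2 ** len(a) - 1))
    return total

m = {frozenset("A"): 0.4, frozenset("AB"): 0.3, frozenset("ABC"): 0.3}
print(deng_entropy(m))
```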

7.
Opt Express ; 30(13): 24084-24102, 2022 Jun 20.
Article in English | MEDLINE | ID: mdl-36225077

ABSTRACT

In the presence of complex background noise, parasitic light, and dust attachment, high-precision laser-induced damage change detection of optical elements in captured optical images remains a challenging issue. To resolve this problem, this paper presents an end-to-end damage change detection model based on a siamese network and multi-layer perceptrons (SiamMLP). Firstly, representative features of bi-temporal damage images are efficiently extracted by cascaded multi-layer perceptron modules in the siamese network. The extracted features are then concatenated and classified into changed and unchanged classes. Owing to its concise architecture and strong feature representation ability, the proposed method obtains excellent damage change detection results efficiently and effectively. To address the unbalanced distribution of hard and easy samples, a novel metric called the hard metric is introduced for quantitatively evaluating how difficult each sample is to classify. The hard metric assigns a classification difficulty to each individual sample so as to precisely adjust the loss assigned to that sample. In the training stage, a novel hard loss is presented to train the proposed model. Cooperating with the hard metric, the hard loss up-weights the loss of hard samples and down-weights the loss of easy samples, which gives the proposed model a more powerful online hard sample mining ability. Experimental results on two real datasets validate the effectiveness and superiority of the proposed method.
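The up-weight-hard / down-weight-easy idea can be illustrated with a difficulty-weighted binary cross-entropy of the focal-loss form, sketched below; this is only an analogy, not the paper's actual hard metric or hard loss, and the exponent gamma is an assumption.

```python
# Difficulty-weighted binary cross-entropy: samples the model finds hard
# (prediction far from the label) receive a larger weight.

import math

def weighted_bce(p: float, y: int, gamma: float = 2.0) -> float:
    """y in {0, 1}; p is the predicted probability of the 'changed' class."""
    p = min(max(p, 1e-7), 1 - 1e-7)
    p_true = p if y == 1 else 1 - p
    difficulty = (1 - p_true) ** gamma   # larger when the sample is hard
    return -difficulty * math.log(p_true)

print(weighted_bce(0.9, 1))  # easy sample -> small, down-weighted loss
print(weighted_bce(0.2, 1))  # hard sample -> strongly up-weighted loss
```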

8.
Sensors (Basel) ; 21(22)2021 Nov 11.
Article in English | MEDLINE | ID: mdl-34833574

ABSTRACT

Population-based search techniques have been developed and applied to a wide range of applications because of their good performance, for example the optimization of unmanned aerial vehicle (UAV) path planning problems. However, the search for optimal solutions to an optimization problem is usually expensive. The UAV path planning problem, for instance, is a large-scale optimization problem with many constraints, which makes it hard to obtain exact solutions, and it becomes especially time-consuming when multiple UAV problems must be optimized at the same time. Evolutionary multi-task optimization (EMTO) studies how to exploit the population-based characteristics of evolutionary computation techniques to optimize multiple problems simultaneously, with the aim of further improving the overall performance of solving all of them; EMTO therefore has great potential for solving real-world problems more efficiently. In this paper, we develop a novel EMTO algorithm based on a classical particle swarm optimization (PSO) algorithm, in which the developed knowledge transfer strategy achieves transfer between tasks by synthesizing the knowledge transferred from a selected set of component tasks while updating the velocities of the population. Two knowledge transfer strategies are developed, along with two versions of the proposed algorithm. The proposed algorithm is compared with the multifactorial PSO algorithm, the SREMTO algorithm, the popular multifactorial evolutionary algorithm, and a classical PSO algorithm on nine popular single-objective MTO problems and six five-task MTO problems, which demonstrates its superiority.
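As an illustration of the kind of velocity update involved, the sketch below adds a hypothetical inter-task transfer term to a standard PSO velocity update; the paper's task-selection and knowledge-synthesis mechanisms are more involved, and the transfer coefficient c3 is an assumption.

```python
# Standard PSO velocity update extended with one assumed inter-task term.

import random

def update_velocity(v, x, pbest, gbest, other_task_best,
                    w=0.7, c1=1.5, c2=1.5, c3=0.5):
    r1, r2, r3 = (random.random() for _ in range(3))
    return [w * vi
            + c1 * r1 * (pb - xi)   # cognitive component
            + c2 * r2 * (gb - xi)   # social component (own task)
            + c3 * r3 * (ob - xi)   # knowledge transferred from another task
            for vi, xi, pb, gb, ob in zip(v, x, pbest, gbest, other_task_best)]

v = update_velocity([0.0, 0.0], [1.0, 2.0], [0.5, 1.5], [0.2, 1.0], [0.8, 1.8])
print(v)
```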

9.
Sensors (Basel) ; 21(17)2021 Sep 02.
Article in English | MEDLINE | ID: mdl-34502792

ABSTRACT

Deep neural networks have achieved remarkable development and wide application thanks to their performance. However, their complex structure and their high computation and storage demands limit their use on mobile or embedded devices such as sensor platforms. Neural network pruning is an efficient way to derive a lightweight model from a well-trained complex deep neural network. In this paper, we propose an evolutionary multi-objective one-shot filter pruning method for designing lightweight convolutional neural networks. Firstly, unlike well-known iterative pruning methods, the one-shot pruning framework performs filter pruning and model fine-tuning only once. Moreover, we formulate a constrained multi-objective filter pruning problem whose two objectives are the filter pruning ratio and the accuracy of the pruned convolutional neural network. A non-dominated-sorting-based evolutionary multi-objective algorithm is used to solve the filter pruning problem, providing a set of Pareto solutions that consists of a series of pruned models with different trade-offs. Finally, some models are uniformly selected from the Pareto set and fine-tuned as the output of our method. The effectiveness of our method is demonstrated in experimental studies on four designed models, LeNet and AlexNet: it can prune over 85%, 82%, 75%, 65%, 91% and 68% of the filters, respectively, with little accuracy loss.


Subject(s)
Algorithms , Neural Networks, Computer , Biological Evolution
10.
Sensors (Basel) ; 21(13)2021 Jun 24.
Article in English | MEDLINE | ID: mdl-34202672

ABSTRACT

A single-layer ±45° dual-polarized directional array antenna for millimeter-wave (mm-wave) applications is designed in this communication. Based on the theory of orthogonal circularly polarized (CP) wave multiplexing, the two ports of a series-fed dual-CP array are fed with equal amplitudes, and the array can radiate a linearly polarized wave with ±45° polarization orientations by adjusting the feeding phase difference. As the two ports of the series-fed array are excited simultaneously, the antenna achieves directional radiation. In addition, the cross-polarization level of the array can be effectively suppressed by placing two series-fed arrays side by side. A prototype of the designed array antenna operating at 30 GHz is fabricated and measured; the working bandwidth of the proposed antenna is approximately 3.5%. Owing to its simple structure and directional radiation, the proposed antenna array is a competitive candidate for mm-wave applications.

11.
Sensors (Basel) ; 21(3)2021 Jan 28.
Article in English | MEDLINE | ID: mdl-33525527

ABSTRACT

Deep neural networks have evolved significantly over the past decades and can now achieve better processing of sensor data. Nonetheless, most deep models follow the ruling maxim of deep learning, "bigger is better", and therefore have very complex structures. As the models become more complex, their computational complexity and resource consumption increase significantly, making them difficult to deploy on resource-limited platforms such as sensor platforms. In this paper, we observe that different layers often have different pruning requirements, and we propose a differential-evolution-based layer-wise weight pruning method. Firstly, the pruning sensitivity of each layer is analyzed, and then the network is compressed by iterating the weight pruning process. Unlike other methods that set pruning ratios greedily or by statistical analysis, we establish an optimization model to find the optimal set of per-layer pruning sensitivities. Differential evolution, an effective population-based optimization method, is used to solve this problem. Furthermore, we adopt a strategy to recover some of the removed connections during the fine-tuning phase to increase the capacity of the pruned model. The effectiveness of our method is demonstrated in experimental studies: it compresses the number of weight parameters in LeNet-300-100, LeNet-5, AlexNet and VGG16 by 24×, 14×, 29× and 12×, respectively.
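A compact differential-evolution loop of the DE/rand/1/bin form, searching over a vector of per-layer pruning ratios, is sketched below; the fitness function is a stand-in, since evaluating a candidate in the paper would involve actually pruning the network at those ratios and measuring its accuracy.

```python
# DE/rand/1/bin sketch over a vector of per-layer pruning ratios in [0, 1].

import random

def fitness(ratios):
    # Hypothetical surrogate: reward high pruning while penalizing "accuracy
    # loss", modeled here as squared deviation from a per-layer tolerance 0.6.
    return sum(r - 2.0 * (r - 0.6) ** 2 for r in ratios)

def differential_evolution(dim, pop_size=20, F=0.5, CR=0.9, generations=100):
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i, target in enumerate(pop):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(max(a[d] + F * (b[d] - c[d]), 0.0), 1.0)
                     if random.random() < CR else target[d]
                     for d in range(dim)]
            if fitness(trial) > fitness(target):   # greedy selection
                pop[i] = trial
    return max(pop, key=fitness)

print(differential_evolution(dim=4))  # one ratio per prunable layer
```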

12.
IEEE Trans Cybern ; 51(7): 3802-3812, 2021 Jul.
Article in English | MEDLINE | ID: mdl-30951491

ABSTRACT

Retrieving nearest neighbors across correlated data in multiple modalities, such as image-text pairs on Facebook and video-tag pairs on YouTube, has become a challenging task due to the huge amount of data. Multimodal hashing methods that embed data into binary codes can boost retrieval speed and reduce storage requirements. Since unsupervised multimodal hashing methods are usually inferior to supervised ones, and supervised ones require a large amount of manually labeled data, the method proposed in this paper utilizes partially available labels to design a semisupervised multimodal hashing method. The labels of unlabeled data are treated as an interval type-2 fuzzy set and estimated from the available labels. By defuzzifying the estimated labels using hard partitioning, a supervised multimodal hashing method is used to generate binary codes. Experiments show that the proposed semisupervised method with 50% of the labels achieves a medium performance among the compared supervised methods and approaches the performance of the best supervised method trained with 90% of the labels. With only 10% of the labels, the proposed method can still compete with the worst of the compared supervised methods. Furthermore, the proposed label estimation method is experimentally shown to be more suitable for the multi-labeled MIRFlickr data set in a hash lookup task.

13.
Sensors (Basel) ; 20(24)2020 Dec 09.
Article in English | MEDLINE | ID: mdl-33316906

ABSTRACT

Domain adaptation aims to handle the distribution mismatch between training and testing data and has achieved dramatic progress in multi-sensor systems. Previous methods align the cross-domain distributions using statistics such as means and variances. Despite their appeal, such methods often fail to model the discriminative structure existing within the testing samples. In this paper, we present a sample-guided adaptive class prototype method that requires no distribution matching. Specifically, two adaptive measures are proposed. Firstly, a modified nearest class prototype is introduced, which allows more diversity within the same class while keeping most of the class-wise discriminative information. Secondly, we put forward an easy-to-hard testing scheme that takes into account the different difficulties of recognizing target samples: easy samples are classified first and then selected to assist the prediction of hard samples. Extensive experiments verify the effectiveness of the proposed method.
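A minimal sketch of nearest-class-prototype classification with an easy-to-hard pass follows: confidently labeled target samples are used to refine the prototypes before the remaining samples are classified. The paper's adaptive prototype modification is more elaborate, and the margin threshold and prototype-update rule below are assumptions.

```python
# Nearest-class-prototype classification with an easy-to-hard second pass.

import numpy as np

def easy_to_hard_predict(prototypes, X_target, margin=0.5):
    """prototypes: (C, d) class prototypes computed from the source domain."""
    d = np.linalg.norm(X_target[:, None, :] - prototypes[None, :, :], axis=2)
    order = np.sort(d, axis=1)
    easy = (order[:, 1] - order[:, 0]) > margin   # clear nearest prototype
    pred = d.argmin(axis=1)

    # Refine prototypes with the easy target samples, then re-label the rest.
    refined = prototypes.copy()
    for c in range(prototypes.shape[0]):
        sel = easy & (pred == c)
        if sel.any():
            refined[c] = 0.5 * prototypes[c] + 0.5 * X_target[sel].mean(axis=0)
    d_hard = np.linalg.norm(X_target[~easy][:, None, :] - refined[None, :, :], axis=2)
    pred[~easy] = d_hard.argmin(axis=1)
    return pred

rng = np.random.default_rng(0)
prototypes = np.array([[0.0, 0.0], [3.0, 3.0]])   # source class means
X_target = rng.normal(0.0, 1.0, (50, 2)) + rng.integers(0, 2, (50, 1)) * 3.2
print(easy_to_hard_predict(prototypes, X_target)[:10])
```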

14.
Sensors (Basel) ; 20(20)2020 Oct 16.
Article in English | MEDLINE | ID: mdl-33081365

ABSTRACT

Distribution mismatch caused by varying resolutions, backgrounds, and other factors is common in multi-sensor systems. Domain adaptation attempts to reduce such domain discrepancy by means of different measures, e.g., the maximum mean discrepancy (MMD). Despite their success, such methods often fail to guarantee the separability of the learned representation. To tackle this issue, we put forward a novel approach that jointly learns domain-shared and discriminative representations. Specifically, we model the feature discrimination explicitly for both domains: alternating discriminant optimization is proposed to obtain discriminative features with an l2 constraint in the labeled source domain, and sparse filtering is introduced to capture the intrinsic structure of the unlabeled target domain. Finally, these components are integrated in a unified framework together with MMD to align the domains. Extensive experiments against state-of-the-art methods verify the effectiveness of our approach on cross-domain tasks.
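The MMD alignment term mentioned above is standard; the sketch below computes a biased estimate of the squared MMD with an RBF kernel. The discriminative components (alternating discriminant optimization, sparse filtering) are specific to the paper and not reproduced.

```python
# Biased estimate of the squared maximum mean discrepancy with an RBF kernel.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=1.0):
    """Squared MMD between source samples X and target samples Y."""
    return (rbf_kernel(X, X, gamma).mean()
            - 2 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(100, 5))   # source features
Xt = rng.normal(0.5, 1.0, size=(100, 5))   # shifted target features
print(mmd2(Xs, Xt))
```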

15.
Entropy (Basel) ; 21(2)2019 Feb 10.
Article in English | MEDLINE | ID: mdl-33266879

ABSTRACT

Dempster-Shafer evidence theory (DST) has shown great advantages in tackling uncertainty in a wide variety of applications. However, how to quantify the information-based uncertainty of a basic probability assignment (BPA) with belief entropy in the DST framework is still an open issue. The main work of this study is to define a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy has two components. The first component is based on the summation of the probability mass function (PMF) of the single events contained in each BPA, obtained using the plausibility transformation. The second component is the same as the weighted Hartley entropy. The two components effectively measure the discord uncertainty and the non-specificity uncertainty in the DST framework, respectively. The proposed belief entropy is proved to satisfy the majority of the desired properties of an uncertainty measure in the DST framework. In addition, when the BPA is a probability distribution, the proposed measure degenerates to Shannon entropy. The feasibility and superiority of the new belief entropy are verified by numerical experiments.
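The two named ingredients, the plausibility transformation and the weighted Hartley entropy, are sketched below; how the paper combines them into its final belief entropy is not reproduced, and the simple sum of a Shannon-type term and the Hartley term used here is only an assumed illustration.

```python
# Plausibility transformation of a BPA and the weighted Hartley entropy,
# combined into an illustrative (assumed) two-part belief entropy.

import math

def plausibility_transform(m: dict, frame: frozenset) -> dict:
    pl = {x: sum(mass for a, mass in m.items() if x in a) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def weighted_hartley(m: dict) -> float:
    return sum(mass * math.log2(len(a)) for a, mass in m.items())

def illustrative_belief_entropy(m: dict, frame: frozenset) -> float:
    p = plausibility_transform(m, frame)
    discord = -sum(v * math.log2(v) for v in p.values() if v > 0)  # discord part
    return discord + weighted_hartley(m)                           # non-specificity part

frame = frozenset("ABC")
m = {frozenset("A"): 0.5, frozenset("AB"): 0.3, frozenset("ABC"): 0.2}
print(illustrative_belief_entropy(m, frame))
```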

16.
Sensors (Basel) ; 18(6)2018 Jun 11.
Article in English | MEDLINE | ID: mdl-29891816

ABSTRACT

Quantifying the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and it is largely unexplored under the open-world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper extends a belief entropy to the open world by simultaneously considering the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension of Deng entropy to the open-world assumption (EDEOW) is proposed as a generalization of Deng entropy, and it degenerates to Deng entropy in the closed world wherever necessary. To test the reasonableness and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues remain: the necessary properties of a belief entropy under the open-world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion framework is for sensor data fusion under uncertainty.

17.
Sensors (Basel) ; 17(9)2017 Sep 18.
Article in English | MEDLINE | ID: mdl-28927017

ABSTRACT

As an important tool for information fusion, Dempster-Shafer evidence theory is widely applied to handling uncertain information in fault diagnosis. However, an incorrect result may be obtained if the combined evidence is highly conflicting, which may lead to a failure to locate the fault. To deal with this problem, an improved evidential Induced Ordered Weighted Averaging (IOWA) sensor data fusion approach is proposed in the framework of Dempster-Shafer evidence theory. In the new method, the IOWA operator is used to determine the weight of each sensor data source, and both the evidence distance and the belief entropy are taken into consideration when determining the parameters of the IOWA operator. First, the α value of the IOWA operator is obtained based on the global distance of evidence and the global belief entropy, and a weight vector is derived from a maximum-entropy model. Then, according to the IOWA operator, the evidence is modified before applying Dempster's combination rule. The proposed method performs better in conflict management and fault diagnosis because the information volume of each piece of evidence is taken into consideration. A numerical example and a case study in fault diagnosis are presented to show the rationality and efficiency of the proposed method.
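The IOWA operator itself is standard: the arguments are reordered by an inducing variable before the weighted aggregation. A minimal sketch follows, with placeholder inducing values and weights; the paper's derivation of the inducing values (evidence distance plus belief entropy) and of the weight vector (maximum-entropy model) is not reproduced.

```python
# Induced ordered weighted averaging (IOWA): arguments are ordered by an
# inducing variable, then aggregated with a weight vector.

def iowa(pairs, weights):
    """pairs: list of (inducing_value, argument); weights sum to 1."""
    ordered = sorted(pairs, key=lambda p: p[0], reverse=True)
    return sum(w * a for w, (_, a) in zip(weights, ordered))

# Hypothetical example: three sensor readings induced by reliability scores.
pairs = [(0.9, 20.1), (0.4, 23.5), (0.7, 20.8)]
weights = [0.5, 0.3, 0.2]    # assumed weight vector
print(iowa(pairs, weights))  # = 0.5*20.1 + 0.3*20.8 + 0.2*23.5
```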

18.
PLoS One ; 12(5): e0176832, 2017.
Article in English | MEDLINE | ID: mdl-28481914

ABSTRACT

How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass function itself, while the information carried by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are not as efficient as they could be. In this paper, a modified belief entropy is proposed that considers the scale of the FOD and the relative scale of each focal element with respect to the FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. Moreover, with less information loss, the new measure overcomes the shortcomings of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.


Subject(s)
Entropy , Algorithms , Uncertainty
19.
PLoS One ; 12(5): e0177695, 2017.
Article in English | MEDLINE | ID: mdl-28542549

ABSTRACT

Dempster-Shafer evidence theory has been widely used in various applications. However, how to avoid counter-intuitive outcomes when fusing conflicting evidence with the classical Dempster-Shafer combination rule is still an open issue. Many approaches based on discounted evidence and weighted-average evidence have been investigated and have brought significant improvements, but all of them have inherent flaws. In this paper, a new weighting factor is proposed to address this problem. First, a modified dissimilarity measure is proposed that is characterized by both the distance and the conflict between pieces of evidence. Second, a measure of the information volume of each piece of evidence based on Deng entropy is introduced. The two kinds of weight derived from these measures are then combined into a new weighting factor, and a weighted-average method based on this factor is proposed. Numerical examples illustrate the validity and effectiveness of the proposed method. Finally, the new method is applied to a real-life application of river water quality monitoring, where it effectively identifies the major land-use activities contributing to river pollution.
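Weighted-average fusion schemes of this family typically average the BPAs with credibility weights and then combine the averaged BPA with itself n-1 times using Dempster's rule; a sketch under that assumption follows, with placeholder weights standing in for the paper's new weighting factor.

```python
# Weighted-average fusion: average the BPAs with credibility weights, then
# combine the averaged BPA n-1 times with Dempster's rule.

def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

def weighted_average_fusion(bpas, weights):
    avg = {}
    for m, w in zip(bpas, weights):
        for a, mass in m.items():
            avg[a] = avg.get(a, 0.0) + w * mass
    fused = avg
    for _ in range(len(bpas) - 1):   # combine the average with itself n-1 times
        fused = dempster_combine(fused, avg)
    return fused

bpas = [
    {frozenset("A"): 0.6, frozenset("B"): 0.3, frozenset("AB"): 0.1},
    {frozenset("A"): 0.0, frozenset("B"): 0.9, frozenset("AB"): 0.1},  # conflicting source
    {frozenset("A"): 0.7, frozenset("B"): 0.2, frozenset("AB"): 0.1},
]
weights = [0.4, 0.2, 0.4]   # placeholder credibility weights (sum to 1)
print(weighted_average_fusion(bpas, weights))
```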


Subject(s)
Environmental Monitoring , Environmental Pollution/prevention & control , Rivers/chemistry , Entropy , Water Quality
20.
Sensors (Basel) ; 17(4)2017 Apr 22.
Article in English | MEDLINE | ID: mdl-28441736

ABSTRACT

In real applications, measuring the degree of uncertainty of sensor reports before applying sensor data fusion is a big challenge. In this paper, within the framework of Dempster-Shafer evidence theory, a weighted belief entropy based on Deng entropy is proposed to quantify the uncertainty of uncertain information. The weight of the proposed belief entropy is based on the relative scale of a proposition with respect to the frame of discernment (FOD). Compared with some other uncertainty measures in the Dempster-Shafer framework, the new measure focuses on the uncertain information represented not only by the mass function but also by the scale of the FOD, which means less information loss in information processing. A new multi-sensor data fusion approach based on the weighted belief entropy is then proposed. The rationality and superiority of the new multi-sensor data fusion method are verified by an experiment on artificial data and an application to fault diagnosis of a motor rotor.
