Results 1 - 4 of 4
1.
Entropy (Basel); 26(3), 2024 Feb 22.
Article in English | MEDLINE | ID: mdl-38539697

ABSTRACT

We explore formal similarities and mathematical transformation formulas between general trace-form entropies and the Gini index, originally used in quantifying income and wealth inequalities. We utilize the notion of gintropy, introduced in our earlier works as a property of the Lorenz curve drawn in the map of the tail-integrated cumulative population and wealth fractions. In particular, we rediscover Tsallis' q-entropy formula related to the Pareto distribution. As a novel result, we express the traditional entropy in terms of gintropy and reconstruct further non-additive formulas. A dynamical model calculation of the evolution of the Gini index is also presented.
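As background for the quantities named above, the following is a minimal numerical sketch, not taken from the paper, of the two standard definitions it builds on: the Gini index of a sample (equivalently, one minus twice the area under its Lorenz curve) and the Tsallis q-entropy of a discrete distribution. The function names and the Pareto test sample are illustrative choices.

import numpy as np

def gini_index(wealth):
    # Gini index from a sorted sample; algebraically equal to
    # 1 - 2 * (area under the sample's Lorenz curve).
    x = np.sort(np.asarray(wealth, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1) / n

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); reduces to the Shannon
    # (Boltzmann-Gibbs) entropy in the limit q -> 1.
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Classical Pareto sample with tail exponent alpha = 2 (x_min = 1),
# whose theoretical Gini index is 1 / (2*alpha - 1) = 1/3.
sample = np.random.default_rng(0).pareto(2.0, 100_000) + 1.0
print(gini_index(sample))                     # ~ 0.333
print(tsallis_entropy([0.5, 0.3, 0.2], 2.0))  # 0.62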

2.
Sci Rep; 12(1): 18026, 2022 Oct 27.
Article in English | MEDLINE | ID: mdl-36302821

ABSTRACT

The aim of this paper is to perform uni- and multivariate time series classification tasks with linear law-based feature space transformation (LLT). First, LLT is used to separate the training and test sets of instances. Then, it identifies the governing patterns (laws) of each input sequence in the training set by applying time-delay embedding and spectral decomposition. Finally, it uses the laws of the training set to transform the feature space of the test set. These calculation steps have a low computational cost and the potential to form a learning algorithm. For the empirical study of LLT, a widely used human activity recognition database called AReM is employed. Based on the results, LLT vastly increases the accuracy of traditional classifiers, outperforming state-of-the-art methods after the proposed feature space transformation is applied. The fastest error-free classification on the test set is achieved by combining LLT and the k-nearest neighbor (KNN) algorithm while performing fivefold cross-validation.
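The abstract names the generic ingredients of LLT: time-delay embedding, spectral decomposition, and transforming the test set with the laws found on the training set. The Python sketch below illustrates that pipeline under simplifying assumptions; the window length, the use of the smallest-eigenvalue eigenvector of the embedded covariance matrix as a sequence's "law", and the residual-energy features are illustrative choices, not the authors' exact LLT procedure.

import numpy as np

def time_delay_embed(series, dim):
    # Time-delay embedding: rows are length-`dim` sliding windows of the series.
    series = np.asarray(series, dtype=float)
    n = len(series) - dim + 1
    return np.stack([series[i:i + dim] for i in range(n)])

def sequence_law(series, dim=5):
    # Illustrative "law" of a sequence: the eigenvector of the embedded
    # covariance matrix with the smallest eigenvalue, i.e. the linear
    # combination of delayed samples that is closest to vanishing.
    E = time_delay_embed(series, dim)
    cov = E.T @ E / len(E)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]

def law_features(series, laws, dim=5):
    # Feature vector: residual energy of each training law applied to this
    # series' embedding (small residual = the law describes the series well).
    E = time_delay_embed(series, dim)
    return np.array([np.mean((E @ v) ** 2) for v in laws])

# Illustrative usage with synthetic data (not the AReM benchmark).
rng = np.random.default_rng(1)
train = [np.sin(0.1 * np.arange(200)) + 0.05 * rng.standard_normal(200) for _ in range(10)]
laws = [sequence_law(s) for s in train]
test = np.sin(0.1 * np.arange(200) + 0.3)
print(law_features(test, laws))

The resulting feature vectors could then be passed to a standard classifier such as k-nearest neighbours with fivefold cross-validation, as the abstract describes.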

3.
Entropy (Basel); 24(9), 2022 Sep 17.
Article in English | MEDLINE | ID: mdl-36141199

ABSTRACT

In this paper, we explore a new approach in which understanding is interpreted as a set representation. We prove that understanding/representation, i.e., finding the appropriate coordination of the data, is equivalent to minimizing the representational entropy. To control the search for the correct representation, we propose a loss function that combines the representational entropy with type I and type II errors. Computational complexity estimates are presented both for the process of understanding and for using the representation found.
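The abstract states only that the loss combines the representational entropy with the two error types; one generic additive form of such a loss, with the linear weighting being an illustrative assumption rather than the paper's definition, is

\mathcal{L} = H_{\mathrm{rep}} + \alpha\,\epsilon_{\mathrm{I}} + \beta\,\epsilon_{\mathrm{II}},

where H_rep is the representational entropy, \epsilon_I and \epsilon_II are the type I (false positive) and type II (false negative) error rates, and \alpha, \beta are trade-off weights.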

4.
Phys Rev Lett; 94(13): 132302, 2005 Apr 08.
Article in English | MEDLINE | ID: mdl-15903987

ABSTRACT

We show that the well-known linear Langevin equation, modeling Brownian motion and leading to a Gaussian stationary distribution of the corresponding Fokker-Planck equation, is changed by even the smallest amount of multiplicative noise. This leads to a power-law tail of the distribution for sufficiently large momenta. At a finite ratio of the correlation strengths of the multiplicative and additive noises, the stationary energy distribution becomes exactly the Tsallis distribution.
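In schematic notation (the symbols below follow a common convention and are not copied from the paper), the setup is a linear Langevin equation whose damping coefficient carries multiplicative noise alongside the additive noise term,

\dot{p} + \gamma(t)\,p = \xi(t), \qquad \langle \xi(t)\,\xi(t')\rangle = 2D\,\delta(t-t'), \qquad \langle \gamma(t)\,\gamma(t')\rangle = 2C\,\delta(t-t'),

and the stationary solution of the corresponding Fokker-Planck equation for the energy E = p^2/2 takes the Tsallis form

f(E) \propto \left(1 + (q-1)\,\frac{E}{T}\right)^{-\frac{1}{q-1}},

with the parameters q and T determined by the multiplicative and additive noise strengths C and D and the mean damping, as derived in the paper.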
