ABSTRACT
In this work, we further develop the junction tree variational autoencoder (JT VAE) architecture, both in its implementation and in the application of the model's internal feature space. Pretraining the JT VAE on a large dataset and then optimizing it jointly with a regression model yields a latent space that can serve several tasks simultaneously: property prediction, molecule generation, and structure optimization. We use the ZINC database as the source of molecules for JT VAE pretraining and the QM9 dataset with its HOMO values as the application case. We evaluate the model on property (value) prediction, generation of new molecules with predefined properties, and structure modification toward a target property. Across these tasks, our model improves on generation and optimization while matching the prediction accuracy of state-of-the-art models.
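The latent-space workflow summarized above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: a toy array of fixed latent codes stands in for the pretrained JT VAE encoder, a ridge regressor maps codes to a surrogate property value, and gradient steps in latent space move a code toward higher predicted values. All names and numbers are made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "latent codes" standing in for a pretrained encoder's output
# (200 molecules, 8 latent dimensions) and surrogate property values.
Z = rng.normal(size=(200, 8))
w_true = rng.normal(size=8)
y = Z @ w_true + 0.01 * rng.normal(size=200)

# 1) Property prediction: ridge regression fitted on the latent space.
lam = 1e-2
w = np.linalg.solve(Z.T @ Z + lam * np.eye(8), Z.T @ y)

# 2) Property-guided optimization: gradient ascent on the predicted
#    property f(z) = w . z, starting from an existing molecule's code.
z = Z[0].copy()
for _ in range(50):
    z += 0.1 * w  # df/dz = w for a linear predictor

# The optimized code should score higher than its starting point;
# decoding z back to a molecule is the (omitted) JT VAE decoder's job.
print(float(z @ w) > float(Z[0] @ w))
```

In the actual model the regressor is trained together with the VAE so that the property varies smoothly along latent directions; the linear predictor here only illustrates the mechanics of optimizing in latent space.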
ABSTRACT
A review of the data in the literature makes it clear that the differences in the concentrations of certain elements in human hair between different people are too large to serve as standards, or as deviations from standards, for determining the trace-element composition of hair. The publication of incompatible data raises new questions. How are elements distributed over the area of the donor's head? What is the character of this distribution? Is the distribution function identical for all elements? Hair samples were taken from five points on the heads of six people. The samples were analysed by X-ray fluorescence excited by synchrotron radiation (SRXRF), and the results were compared with those from a similar sample analysed by total reflection X-ray fluorescence (TXRF). Elements that show a constant concentration over the whole head are identified.
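The final step, identifying elements whose concentration is constant over the head, can be illustrated with a simple screening rule. This sketch is not from the paper: it flags an element as "constant" when the coefficient of variation across the five sampling points falls below a chosen relative threshold, and the element names and concentration values are invented for the example.

```python
# Hypothetical concentrations (µg/g) at five scalp sampling points.
concentrations = {
    "Zn": [185.0, 190.0, 188.0, 183.0, 187.0],
    "Ca": [450.0, 900.0, 620.0, 300.0, 780.0],
}

def is_constant(values, rel_threshold=0.10):
    """Flag an element as constant if its coefficient of variation
    (std / mean) over the sampling points is below rel_threshold."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    cv = var ** 0.5 / mean
    return cv < rel_threshold

constant_elements = [el for el, vals in concentrations.items()
                     if is_constant(vals)]
print(constant_elements)  # → ['Zn']
```

The 10% threshold is arbitrary here; in practice it would have to be set relative to the measurement uncertainty of the SRXRF analysis itself.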