Results 1 - 8 of 8
1.
MethodsX ; 11: 102269, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37457433

ABSTRACT

This paper presents a practical overview of the interconnection and automated integration of Industry 4.0 machinery within the supply chains and logistics of two companies in southern Italy. The authors analyse Italian legislation and its strict requirements to assess which investments qualify as Industry 4.0, with a focus on business risk. The work also demonstrates the potential of a newly developed framework that uses the OPC-UA and Modbus protocols to access the functional variables of 4.0 machinery bidirectionally, directly from cloud applications. The proposed solutions help companies develop more efficient production processes and fulfil the requirements imposed by Italian regulations in order to qualify for Industry 4.0 financial incentives.
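
The abstract does not include code; below is a minimal sketch of the kind of bidirectional variable access it describes, assuming the python-opcua client library. The endpoint URL and node identifier are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of bidirectional OPC-UA access from a cloud-side Python
# application, using the python-opcua library. Endpoint and node id are
# hypothetical placeholders, not taken from the paper.
from opcua import Client

client = Client("opc.tcp://machine.example.com:4840")  # hypothetical endpoint
client.connect()
try:
    node = client.get_node("ns=2;s=Machine1.SpindleSpeed")  # hypothetical node
    current = node.get_value()     # read a functional variable
    print("spindle speed:", current)
    node.set_value(current * 0.9)  # write it back (bidirectional access)
finally:
    client.disconnect()
```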

2.
BMC Bioinformatics ; 24(1): 231, 2023 Jun 05.
Article in English | MEDLINE | ID: mdl-37271819

ABSTRACT

When it was first introduced in 2000, reverse vaccinology was defined as an in silico approach that begins with the pathogen's genomic sequence and ends with a list of candidate proteins, and possibly peptides, that must be experimentally confirmed for vaccine production. In the years since, reverse vaccinology has changed dramatically: it now encompasses a large number of bioinformatics tools and processes, including subtractive proteomics, computational vaccinology, immunoinformatics, and related in silico procedures. However, the state of the art in reverse vaccinology still lacks the ability to predict the efficacy of a proposed vaccine formulation. Here, we describe how to fill this gap by introducing an advanced immune system simulator that tests the efficacy of a vaccine formulation against the disease for which it was designed. As a working example, we apply this advanced reverse vaccinology approach end to end to design and predict the efficacy of a candidate vaccine formulation against influenza H5N1. Climate change and melting glaciers are a critical concern because they can reactivate frozen viruses and trigger new pandemics, and H5N1 is one of the strains present in icy lakes with pandemic potential. Investigating structural antigen proteins is the most promising therapeutic route to an effective vaccine against H5N1. In particular, we designed a multi-epitope vaccine based on predicted epitopes of the hemagglutinin and neuraminidase proteins that potentially trigger B-cell, CD4, and CD8 T-cell immune responses. The antigenicity and toxicity of all predicted CTL, helper T-lymphocyte, and B-cell epitopes were evaluated, and only antigenic and non-allergenic epitopes were selected. From the perspective of advanced reverse vaccinology, the Universal Immune System Simulator, an in silico trial computational framework, was applied to estimate vaccine efficacy using a cohort of 100 digital patients.
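
The abstract outlines a filter-then-select step over predicted epitopes. Below is a toy sketch of that selection logic only; the records, fields, and the 0.4 antigenicity threshold are illustrative placeholders, not the paper's predictors or cut-offs.

```python
# Toy selection step over predicted epitopes: keep those that are antigenic,
# non-toxic, and non-allergenic. All values here are illustrative, not from
# the paper.
from dataclasses import dataclass

@dataclass
class Epitope:
    sequence: str
    kind: str            # "CTL", "HTL", or "B-cell"
    antigenicity: float  # e.g. a VaxiJen-style score
    toxic: bool
    allergenic: bool

predicted = [
    Epitope("GAIAGFIEG", "CTL", 0.62, False, False),
    Epitope("NKLYQNPTT", "HTL", 0.15, False, True),
    Epitope("KSSWSSHEA", "B-cell", 0.55, True, False),
]

selected = [e for e in predicted
            if e.antigenicity > 0.4 and not e.toxic and not e.allergenic]
for e in selected:
    print(e.kind, e.sequence)
```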


Subject(s)
Influenza A Virus, H5N1 Subtype , Influenza Vaccines , Influenza, Human , Humans , Influenza, Human/prevention & control , Vaccinology/methods , Vaccine Efficacy , Epitopes, B-Lymphocyte , Proteins , Computational Biology/methods , Immune System , Epitopes, T-Lymphocyte/chemistry , Molecular Docking Simulation , Vaccines, Subunit/chemistry , Vaccines, Subunit/genetics
3.
J Imaging ; 7(8)2021 Jul 29.
Article in English | MEDLINE | ID: mdl-34460762

ABSTRACT

The identification of printed materials is a critical and challenging issue for security purposes, especially when it comes to documents such as banknotes, tickets, or rare collectable cards: eligible targets for ad hoc forgery. State-of-the-art methods require expensive and specific industrial equipment, while a low-cost, fast, and reliable solution for document identification is increasingly needed in many contexts. This paper presents a method to generate a robust fingerprint by extracting translucent patterns from paper sheets and exploiting the peculiarities of binary pattern descriptors. A final descriptor is generated with a block-based approach followed by principal component analysis (PCA) to reduce the overall amount of data to be processed. To validate the robustness of the proposed method, a novel dataset was created and recognition tests were performed under both ideal and noisy conditions.
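
A minimal sketch of the block-based descriptor pipeline the abstract describes, using uniform local binary patterns from scikit-image; the block size, LBP parameters, number of PCA components, and synthetic input images are illustrative stand-ins, not the paper's settings.

```python
# Block-based binary-pattern descriptor followed by PCA, sketched with
# scikit-image and scikit-learn. Parameters are illustrative, not the
# paper's settings.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.decomposition import PCA

def block_lbp_descriptor(img, block=64, P=8, R=1.0):
    """Concatenate uniform-LBP histograms computed on non-overlapping blocks."""
    lbp = local_binary_pattern(img, P, R, method="uniform")
    n_bins = P + 2  # number of distinct uniform-LBP codes
    hists = []
    for y in range(0, img.shape[0] - block + 1, block):
        for x in range(0, img.shape[1] - block + 1, block):
            patch = lbp[y:y + block, x:x + block]
            h, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins), density=True)
            hists.append(h)
    return np.concatenate(hists)

# Synthetic stand-in for scanned translucent-pattern images.
rng = np.random.default_rng(0)
sheet_images = [rng.random((256, 256)) for _ in range(40)]

descriptors = np.stack([block_lbp_descriptor(img) for img in sheet_images])
fingerprints = PCA(n_components=32).fit_transform(descriptors)  # compact fingerprints
```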

4.
Sensors (Basel) ; 21(16)2021 Aug 13.
Article in English | MEDLINE | ID: mdl-34450906

ABSTRACT

The production process of a wafer in the semiconductor industry consists of several phases, such as diffusion and its associated defectivity tests, parametric testing, electrical wafer sort testing, assembly and its associated defectivity tests, final testing, and burn-in. Among these, the fault detection phase is critical to keeping low the number and impact of anomalies that would eventually result in yield loss. Understanding and discovering the causes of yield detractors is a complex root-cause analysis procedure. Many parameters are tracked for fault detection, including pressure, voltage, power, and valve status. In the majority of cases, a fault is due to a combination of two or more parameters whose values individually stay within the designed control limits. In this work, we propose an ensemble anomaly detector that combines univariate and multivariate analyses of the tracked fault-detection parameters. The ensemble is based on three proposed and compared balancing strategies. The experimental phase is conducted on two real datasets gathered in the semiconductor industry and made publicly available. The experimental validation, which also compares our proposal with traditional anomaly detection techniques, shows promising anomaly detection with high recall and a low number of false alarms.
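
The paper's three balancing strategies are not reproduced here; below is a minimal sketch of the general idea of ensembling a univariate control-limit check with a multivariate detector via a weighted vote. The 0.5 weight, the 3-sigma limit, and the choice of Isolation Forest are illustrative assumptions.

```python
# Ensemble of a univariate z-score check and a multivariate Isolation Forest,
# combined with a simple weighted vote. The weight and 3-sigma scaling are
# illustrative stand-ins for the paper's balancing strategies.
import numpy as np
from sklearn.ensemble import IsolationForest

def ensemble_scores(X_train, X_test, weight=0.5):
    mu, sigma = X_train.mean(axis=0), X_train.std(axis=0) + 1e-9
    # Univariate score: worst absolute z-score across parameters, scaled to [0, 1].
    z = np.abs((X_test - mu) / sigma).max(axis=1)
    uni = np.clip(z / 3.0, 0.0, 1.0)
    # Multivariate score: Isolation Forest anomaly score (higher = more anomalous).
    forest = IsolationForest(random_state=0).fit(X_train)
    multi = -forest.score_samples(X_test)
    multi = (multi - multi.min()) / (multi.max() - multi.min() + 1e-9)
    return weight * uni + (1 - weight) * multi

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))           # normal operating data
X_test = np.vstack([rng.normal(size=(5, 4)),  # normal samples
                    [[4.0, 0, 0, 0]]])        # one single-parameter outlier
print(ensemble_scores(X_train, X_test))       # last sample scores highest
```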


Subject(s)
Algorithms , Semiconductors , Diffusion
5.
Aesthet Surg J ; 39(2): 164-173, 2019 01 17.
Article in English | MEDLINE | ID: mdl-29579138

ABSTRACT

Background: Breast shape is usually defined with qualitative assessments (full, flat, ptotic) or with estimates, such as volume or distances between reference points, that cannot describe it reliably. Objectives: The authors quantitatively described breast shape with two parameters derived from a statistical methodology known as principal component analysis (PCA). Methods: The authors created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner and performed PCA on it. For each breast, the two highest-variance PCA values (principal components 1 and 2) were plotted on a Cartesian plane. The methodology was tested on a preoperative and posttreatment surgical case, and test-retest evaluation was performed by two operators. Results: The first two principal components derived from PCA characterize the shapes of the breasts in the dataset. The test-retest showed that different operators obtain very similar PCA values. The system also identified major changes between the preoperative and posttreatment stages of a two-stage reconstruction, and even minor changes were correctly detected. Conclusions: This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator reached similar results in test-retest validation. After further development and validation, this methodology could serve as a useful tool for outcome evaluation, auditing, and benchmarking.
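
A minimal sketch of the analysis the abstract describes, under the assumption that each scanned surface has already been aligned and resampled to a fixed set of 3-D points; the scanner pipeline is not shown and the data here is synthetic.

```python
# Project flattened 3-D surface scans onto their first two principal
# components and plot each case as a point in the PC1-PC2 plane.
# Shapes are assumed pre-aligned and resampled; data here is synthetic.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_cases, n_points = 40, 500
shapes = rng.normal(size=(n_cases, n_points * 3))  # each row: x1, y1, z1, x2, ...

pca = PCA(n_components=2)
pc = pca.fit_transform(shapes)  # columns: principal components 1 and 2

plt.scatter(pc[:, 0], pc[:, 1])
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.title("Shapes in the PC1-PC2 plane")
plt.show()
```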


Subject(s)
Breast Neoplasms/surgery , Breast/diagnostic imaging , Imaging, Three-Dimensional/methods , Mammaplasty/standards , Mastectomy/adverse effects , Adult , Aged , Anatomic Landmarks/anatomy & histology , Anatomic Landmarks/diagnostic imaging , Benchmarking/methods , Breast/anatomy & histology , Breast/surgery , Female , Humans , Imaging, Three-Dimensional/instrumentation , Infrared Rays , Middle Aged , Outcome Assessment, Health Care/methods , Principal Component Analysis , Smartphone , Young Adult
6.
Comput Biol Med ; 77: 23-39, 2016 10 01.
Article in English | MEDLINE | ID: mdl-27498058

ABSTRACT

Automatic food understanding from images is an interesting challenge with applications in different domains. In particular, food intake monitoring is becoming more and more important because of the key role it plays in health and market economies. In this paper, we address the study of food image processing from the perspective of computer vision. As a first contribution, we present a survey of studies on food image processing, from the early attempts to the current state-of-the-art methods. Since retrieval and classification engines able to work on food images are required to build automatic systems for diet monitoring (e.g., to be embedded in wearable cameras), we focus on the representation of food images, because it plays a fundamental role in the understanding engines. Food retrieval and classification is a challenging task since food presents high variability and an intrinsic deformability. To properly study the peculiarities of different image representations, we propose the UNICT-FD1200 dataset. It is composed of 4754 food images of 1200 distinct dishes acquired during real meals. Each food plate is acquired multiple times, and the overall dataset presents both geometric and photometric variability. The images have been manually labeled into 8 categories: Appetizer, Main Course, Second Course, Single Course, Side Dish, Dessert, Breakfast, Fruit. We performed tests employing different state-of-the-art representations to assess their performance on the UNICT-FD1200 dataset. Finally, we propose a new representation based on the perceptual concept of Anti-Textons, which encodes spatial information between Textons and outperforms the other representations in the context of food retrieval and classification.
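
The Anti-Texton representation itself is specific to the paper and not reproduced here; below is a minimal sketch of the underlying bag-of-textons idea it builds on, where per-pixel filter responses are clustered into a visual vocabulary. The Gaussian-derivative filter bank and the vocabulary size of 64 are illustrative assumptions.

```python
# Bag-of-textons baseline: cluster per-pixel filter responses into a texton
# vocabulary, then describe each image as a normalized texton histogram.
# Filter bank and vocabulary size are illustrative choices.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.cluster import MiniBatchKMeans

def filter_responses(img):
    """Stack Gaussian-derivative responses per pixel -> (n_pixels, n_filters)."""
    feats = [gaussian_filter(img, sigma=s, order=o)
             for s in (1, 2, 4) for o in ((0, 1), (1, 0))]
    return np.stack([f.ravel() for f in feats], axis=1)

def texton_histograms(images, n_textons=64):
    responses = np.vstack([filter_responses(im) for im in images])
    kmeans = MiniBatchKMeans(n_clusters=n_textons, random_state=0).fit(responses)
    hists = []
    for im in images:
        labels = kmeans.predict(filter_responses(im))
        h = np.bincount(labels, minlength=n_textons).astype(float)
        hists.append(h / h.sum())
    return np.stack(hists)  # one normalized texton histogram per image
```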


Subject(s)
Algorithms , Food/classification , Image Processing, Computer-Assisted/methods , Information Storage and Retrieval/methods , Pattern Recognition, Automated/methods , Cell Phone , Diet/classification , Humans , Mobile Applications , Photography
7.
IEEE Trans Image Process ; 16(12): 2905-15, 2007 Dec.
Article in English | MEDLINE | ID: mdl-18092590

ABSTRACT

Palette re-ordering is an effective approach for improving the compression of color-indexed images. If the spatial distribution of the indexes in the image is smooth, greater compression ratios can be obtained. As is well known, obtaining an optimal re-indexing scheme is not a trivial task. In this paper, we provide a novel algorithm for the palette re-ordering problem that makes use of a motor map neural network. Experimental results show the effectiveness of the proposed method both in terms of compression ratio and of zero-order entropy of local differences. Its computational complexity is also competitive with previous work in the field.
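
The motor map network itself is not reproduced here; below is a minimal sketch of the evaluation criterion the abstract names, the zero-order entropy of local index differences under a given palette ordering. The 4-connected neighborhood is one common choice, not necessarily the paper's exact definition.

```python
# Zero-order entropy of local index differences: the smoothness criterion a
# palette re-ordering aims to reduce. Lower entropy = smoother index field =
# more compressible under differential-predictive coding.
import numpy as np

def local_difference_entropy(indexes):
    """indexes: 2-D array of palette indexes for one image."""
    diffs = np.concatenate([
        (indexes[:, 1:] - indexes[:, :-1]).ravel(),  # horizontal neighbors
        (indexes[1:, :] - indexes[:-1, :]).ravel(),  # vertical neighbors
    ])
    _, counts = np.unique(diffs, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))  # bits per difference symbol

img = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(f"{local_difference_entropy(img):.2f} bits")
```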


Subject(s)
Algorithms , Color , Colorimetry/methods , Data Compression/methods , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Pattern Recognition, Automated/methods , Reproducibility of Results , Sensitivity and Specificity , Signal Processing, Computer-Assisted
8.
IEEE Trans Image Process ; 13(11): 1419-23, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15540451

ABSTRACT

The efficiency of lossless compression algorithms for fixed-palette (indexed) images may change if a different indexing scheme is adopted. Many lossless compression algorithms adopt a differential-predictive approach; hence, if the spatial distribution of the indexes over the image is smooth, greater compression ratios can be obtained. Because of this, finding an indexing scheme that yields such a smooth distribution is a relevant issue. Obtaining an optimal re-indexing scheme is suspected to be a hard problem, and only approximate solutions have been provided in the literature. In this paper, we restate the re-indexing problem as a graph optimization problem: an optimal re-indexing corresponds to the heaviest Hamiltonian path in a weighted graph. It follows that any algorithm that finds a good approximate solution to this graph-theoretical problem also provides a good re-indexing. We propose a simple and easy-to-implement approximation algorithm to find such a path. The proposed technique compares favorably with most of the algorithms proposed in the literature, both in terms of computational complexity and compression ratio.
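
The paper's actual approximation algorithm is not reproduced here; below is a simple greedy sketch of the graph formulation, where edge weights count co-occurrences of palette indexes in neighboring pixels and the order of the resulting path becomes the new index assignment.

```python
# Greedy heaviest-Hamiltonian-path sketch for palette re-indexing: build a
# co-occurrence weight matrix over palette indexes, then grow a path by
# always extending toward the heaviest remaining edge. A greedy stand-in,
# not the paper's procedure.
import numpy as np

def cooccurrence_weights(indexes, n_colors):
    W = np.zeros((n_colors, n_colors))
    pairs = np.concatenate([
        np.stack([indexes[:, 1:].ravel(), indexes[:, :-1].ravel()], axis=1),
        np.stack([indexes[1:, :].ravel(), indexes[:-1, :].ravel()], axis=1),
    ])
    for a, b in pairs:
        if a != b:
            W[a, b] += 1
            W[b, a] += 1
    return W

def greedy_heaviest_path(W):
    n = W.shape[0]
    start = int(W.sum(axis=1).argmax())  # begin at the most connected color
    path, remaining = [start], set(range(n)) - {start}
    while remaining:
        nxt = max(remaining, key=lambda j: W[path[-1], j])
        path.append(nxt)
        remaining.remove(nxt)
    return path  # position i in the path becomes the new index of color path[i]

idx = np.random.default_rng(0).integers(0, 8, size=(32, 32))
print(greedy_heaviest_path(cooccurrence_weights(idx, 8)))
```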


Subject(s)
Algorithms , Color , Computer Graphics , Data Compression/methods , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Pattern Recognition, Automated/methods , Artificial Intelligence , Cluster Analysis , Hypermedia , Numerical Analysis, Computer-Assisted , Reproducibility of Results , Sensitivity and Specificity , Signal Processing, Computer-Assisted