Results 1 - 20 of 38
1.
Heliyon ; 10(9): e29764, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38694130

ABSTRACT

The parameter identification of failure models for composite plies can be cumbersome due to the multiple effects that accompany brittle fracture. Our work proposes an iterative, nonlinear design of experiments (DoE) approach that finds the most informative experimental data for identifying the parameters of the Tsai-Wu, Tsai-Hill, Hoffman, Hashin, maximum stress and Puck failure models. Since the models perform differently depending on the data, the parameter identification is validated by the Euclidean distance of the measured points to the closest points on the nominal failure surface. The resulting errors provide a basis for ranking the models, which helps to select the best-fitting one. Following validation, the sensitivity of the best model is calculated by partial differentiation, and a theoretical surface is generated. Lastly, an iterative design of experiments is implemented to select the optimal set of experiments from which the parameters can be identified from the least data by minimizing the fitting error. In this way, the number of experiments required to identify a model of a composite material can be significantly reduced. We demonstrate how the proposed method selects the optimal experiments from generated data. The results indicate that if the dataset contains enough information, the method is robust and accurate. If the dataset lacks the necessary information, novel material tests can be proposed based on the optimal points of the parameter sensitivity of the generated failure-model surface.
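A minimal sketch of one ingredient above, assuming plane-stress loading: fitting the Tsai-Wu coefficients to measured failure stresses by linear least squares, with the criterion residual as a simple stand-in for the paper's Euclidean surface-distance error. The synthetic stresses and all names are illustrative.

```python
# Hedged sketch: fit plane-stress Tsai-Wu coefficients from failure stresses,
# then score the fit by the residual of the criterion at each measured point.
import numpy as np

def fit_tsai_wu(s1, s2, t12):
    """Fit F1, F2, F11, F22, F66, F12 so the criterion equals 1 at failure."""
    A = np.column_stack([s1, s2, s1**2, s2**2, t12**2, 2 * s1 * s2])
    b = np.ones_like(s1)
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    residuals = A @ coeffs - b          # proxy for the surface-distance error
    return coeffs, np.abs(residuals)

# Synthetic failure points (MPa) standing in for measured data.
rng = np.random.default_rng(0)
s1 = rng.uniform(-600, 1200, 40)
s2 = rng.uniform(-150, 40, 40)
t12 = rng.uniform(-70, 70, 40)
coeffs, err = fit_tsai_wu(s1, s2, t12)
print("Tsai-Wu coefficients:", coeffs)
print("mean fitting error:", err.mean())   # basis for ranking the models
```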

2.
PLoS One ; 19(5): e0301262, 2024.
Article in English | MEDLINE | ID: mdl-38722864

ABSTRACT

Frequent sequence pattern mining is an excellent tool to discover patterns in event chains. In complex systems, events from parallel processes are present, often without proper labelling, and frequent sequential pattern mining can be applied to identify the groups of events related to each subprocess. Since most algorithms provide so many frequent sequences that the results are difficult to interpret, it is necessary to post-process the resulting frequent patterns. The available visualisation techniques do not allow easy access to the multiple properties that support a faster and better understanding of the event scenarios. To address this issue, our work proposes an intuitive and interactive solution, introducing three novel network-based sequence visualisation methods that can reduce the time of information processing from a cognitive perspective. The proposed visualisation methods offer a more information-rich and more easily understandable interpretation of sequential pattern mining results than the usual text-like output of pattern mining algorithms. The first uses the confidence values of the transitions to create a weighted network, while the second enriches the confidence-based adjacency matrix with the similarities of the transitive nodes; the enriched matrix enables a similarity-based multidimensional scaling (MDS) projection of the sequences. The third method measures similarity based on the overlap of the occurrences of the sequences' supporting events. The applicability of the method is presented on an industrial alarm-management problem and on the analysis of the clickstreams of a website. The method was fully implemented in a Python environment. The results show that the proposed methods are highly applicable for the interactive processing of frequent sequences, supporting the exploration of the inner mechanisms of complex systems.


Subject(s)
Algorithms , Data Mining/methods , Humans
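A minimal sketch of the first visualisation idea, under assumed data: frequent-sequence transitions with confidence values become a weighted directed graph whose edge widths encode confidence. The transition triples are invented; a real pipeline would obtain them from a sequential pattern miner.

```python
# Hedged sketch: confidence-weighted transition network for frequent sequences.
import matplotlib.pyplot as plt
import networkx as nx

# (antecedent event, consequent event, confidence) triples, e.g. from a miner.
transitions = [
    ("alarm_A", "alarm_B", 0.9),
    ("alarm_B", "alarm_C", 0.7),
    ("alarm_A", "alarm_C", 0.4),
]

G = nx.DiGraph()
for src, dst, conf in transitions:
    G.add_edge(src, dst, weight=conf)

# Draw with edge width proportional to transition confidence.
pos = nx.spring_layout(G, seed=1)
nx.draw_networkx(G, pos, width=[5 * G[u][v]["weight"] for u, v in G.edges()])
plt.show()
```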
3.
Heliyon ; 10(8): e29437, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38655321

ABSTRACT

This paper presents a methodology that aims to enhance the accuracy of probability density estimation in mobility-pattern analysis by integrating prior knowledge of the system dynamics and contextual information into the particle filter algorithm. The quality of the data used for density estimation is often inadequate due to measurement noise, which significantly influences the distribution of the measurement data. Thus, it is crucial to augment the information content of the input data by incorporating sources of information beyond the measured position data. These additional sources can include the dynamic model of movement and the spatial model of the environment that shapes the motion patterns. To effectively combine the information provided by positional measurements with the system and environment models, the particle filter algorithm is employed, which generates discrete probability distributions. By subjecting these discrete distributions to exploratory techniques, more reliable information can be extracted than from the raw measurement data alone. Consequently, this study proposes a methodology in which probability density estimation is based not on raw positional data but on the probability-weighted samples generated by the particle filter. This approach yields more compact and precise modeling distributions. Specifically, the method processes position measurement data with a nonparametric density estimator, kernel density estimation. The proposed methodology is thoroughly tested and validated using information-theoretic and probability metrics. Its applicability is demonstrated through a practical example of mobility-pattern analysis based on forklift data in a warehouse environment.
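A minimal sketch of the core step, with a synthetic particle set standing in for the particle filter's output: kernel density estimation over probability-weighted samples rather than raw positions.

```python
# Hedged sketch: weighted KDE over particle-filter output (synthetic data).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
particles = rng.normal(loc=[10.0, 5.0], scale=0.5, size=(500, 2))  # x, y
weights = rng.random(500)
weights /= weights.sum()                     # posterior particle weights

# Particles with higher posterior weight shape the density more strongly.
kde = gaussian_kde(particles.T, weights=weights)
grid = np.mgrid[8:12:100j, 3:7:100j].reshape(2, -1)
density = kde(grid)                          # smoothed mobility density
print("density peak at:", grid[:, density.argmax()])
```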

4.
Sci Rep ; 14(1): 9036, 2024 Apr 19.
Article in English | MEDLINE | ID: mdl-38641683

ABSTRACT

In real-world classification problems, it is important to build accurate prediction models and provide information that can improve decision-making. Decision-support tools are often based on network models, and this article uses the information encoded in social networks to address the problem of employee turnover. However, understanding the factors behind black-box prediction models can be challenging. Our question concerned the predictability of employee turnover, given information from a multilayer network that describes collaborations and perceptions, which assess the performance of organizations and indicate the success of cooperation. Our goal was to develop an accurate prediction procedure, preserve the interpretability of the classification, and capture the wide variety of specific reasons that explain positive cases. After feature engineering, we identified the variables with the best predictive power using decision trees and ranked them based on their added value, considering their frequent co-occurrence. We applied a Random Forest classifier with the SMOTE balancing technique for prediction. We calculated SHAP values to identify the variables that contribute most to individual predictions. As a last step, we clustered the sample based on the SHAP values to fine-tune the explanations of quitting due to different background factors.
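A minimal sketch of the pipeline as the abstract outlines it: SMOTE balancing, Random Forest classification, per-prediction SHAP values, and clustering of the SHAP matrix to group the "reasons for quitting". The data is synthetic and the cluster count is an arbitrary choice.

```python
# Hedged sketch: SMOTE -> Random Forest -> SHAP -> clustering of explanations.
import numpy as np
import shap
from imblearn.over_sampling import SMOTE
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=600, weights=[0.9], random_state=0)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)    # balance classes

model = RandomForestClassifier(random_state=0).fit(X_bal, y_bal)

explainer = shap.TreeExplainer(model)
sv = explainer.shap_values(X)
# Depending on the shap version, sv is a per-class list or a 3-D array;
# keep the contributions towards the positive ("quit") class.
sv_pos = sv[1] if isinstance(sv, list) else sv[..., 1]

# Clustering the explanation vectors groups cases with similar reasons.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(sv_pos)
print(np.bincount(clusters))
```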

5.
Sensors (Basel) ; 24(3)2024 Jan 23.
Article in English | MEDLINE | ID: mdl-38339436

ABSTRACT

This paper proposes a monitoring procedure based on characterizing the state probability distributions estimated by particle filters. The work highlights what types of information can be obtained during state estimation and how the revealed information helps to solve fault-diagnosis tasks. If a failure is present in the system, the output predicted by the model is inconsistent with the actual output, which affects the operation of the estimator: the heterogeneity of the probability distribution of the states increases, and a large proportion of the particles lose their information content. The correlation structure of the posterior probability density can also be altered by failures. The proposed method uses various indicators that characterize the heterogeneity and correlation structure of the state distribution, as well as the consistency between model predictions and observed behavior, to identify the effects of failures. The applicability of the proposed measures is demonstrated through a dynamic vehicle model in which actuator and sensor failure scenarios are investigated.
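One indicator of the kind the abstract describes is the effective sample size (ESS) of the particle weights; a fault that makes model predictions inconsistent with measurements typically collapses it. A minimal sketch with synthetic weights:

```python
# Hedged sketch: effective sample size as a heterogeneity indicator.
import numpy as np

def effective_sample_size(weights):
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w**2)

healthy = np.ones(1000)                            # evenly informative particles
faulty = np.r_[np.ones(10), 1e-6 * np.ones(990)]   # most particles degenerate
print(effective_sample_size(healthy))   # ~1000
print(effective_sample_size(faulty))    # ~10 -> possible fault signature
```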

6.
Heliyon ; 10(4): e25946, 2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38404856

ABSTRACT

Detecting chemical, biological, radiological and nuclear (CBRN) incidents is a high-priority task and has been a topic of intensive research for decades. Ongoing developments in technology, data processing and automation are opening up new potentials in CBRN protection, which has become a complex, interdisciplinary field of science. Accordingly, chemists, physicists, meteorologists, military experts, programmers and data scientists are all involved in the research. The key to effectively enhancing CBRN defence capabilities is continuous and targeted development along a well-structured concept. Our study highlights the importance of predictive analytics by providing an overview of the main components of modern CBRN defence technologies, including a summary of the conceptual requirements for CBRN reconnaissance and decision-support steps, and by presenting the role and recent opportunities of information management in these processes.

7.
MethodsX ; 11: 102260, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37388166

ABSTRACT

While the primary focus of Industry 4.0 revolves around extensive digitalization, Industry 5.0 seeks to integrate innovative technologies with human actors, signifying an approach that is more value-driven than technology-centric. The key objectives of the Industry 5.0 paradigm, which were not central to Industry 4.0, underscore that production should not only be digitalized but also resilient, sustainable and human-centric. This paper focuses on the human-centric pillar of Industry 5.0. The proposed methodology addresses the need for a human-AI collaborative process design and innovation approach to support the development and deployment of advanced AI-driven co-creation and collaboration tools. The method aims to solve the problem of integrating various innovative agents (human, AI, IoT, robot) in a plant-level collaboration process through a generic semantic definition, utilizing a time-event-driven process. It also encourages the development of AI techniques for human-in-the-loop optimization, incorporating cross-checking with alternative feedback-loop models. The benefits of this methodology include the Industry 5.0 collaboration architecture (I5arc), which provides new adaptable, generic frameworks, concepts and methodologies for modern knowledge creation and sharing to enhance plant collaboration processes.
•The I5arc aims to investigate and establish a truly integrated human-AI collaboration model, equipped with methods and tools for human-AI-driven co-creation.
•It provides a framework for the co-execution of processes and activities, with humans remaining empowered and in control.
•The framework primarily targets human-AI collaboration processes and activities in industrial plants, with potential applicability to other societal contexts.

8.
PLoS One ; 18(4): e0284078, 2023.
Article in English | MEDLINE | ID: mdl-37053261

ABSTRACT

Non-negative matrix factorization (NMF) efficiently reduces high dimensionality for many-objective ranking problems. In multi-objective optimization, as long as only three or four conflicting viewpoints are present, an optimal solution can be determined by finding the Pareto front. When the number of objectives increases, the multi-objective problem evolves into a many-objective optimization task, where the Pareto front becomes oversaturated. The key idea is that NMF aggregates the objectives so that the Pareto front can be applied, while the Sum of Ranking Differences (SRD) method selects the objectives that have a detrimental effect on the aggregation and validates the findings. The applicability of the method is illustrated by the ranking of 1176 universities based on 46 variables of the CWTS Leiden Ranking 2020 database. The performance of NMF is compared to principal component analysis (PCA) and sparse non-negative matrix factorization-based solutions. The results illustrate that PCA incorporates negatively correlated objectives into the same principal component. On the contrary, NMF only allows non-negative correlations, which enables the proper use of the Pareto front. With the combination of NMF and SRD, a non-biased ranking of the universities based on the 46 criteria is established, with Harvard, Rockefeller and Stanford Universities ranked as the top three. To evaluate the ranking capabilities of the methods, measures based on Relative Entropy (RE) and Hypervolume (HV) are proposed. The results confirm that the sparse NMF method provides the most informative ranking. The results highlight that academic excellence can be improved by decreasing the proportion of unknown open-access publications and short-distance collaborations. The gender-proportion indicators barely correlate with scientific impact. More authors, long-distance collaborations, and publications with more scientific impact and citations on average influence the university ranking strongly in a positive direction.


Subject(s)
Algorithms , Humans , Universities , Principal Component Analysis
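A minimal sketch of the aggregation step under assumed data: NMF compresses a many-objective matrix into a few non-negative components, after which a Pareto front can be extracted in the reduced space. The matrix is random, not the CWTS Leiden Ranking data.

```python
# Hedged sketch: NMF aggregation of many objectives, then Pareto filtering.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
X = rng.random((100, 46))                # 100 universities x 46 indicators

W = NMF(n_components=3, init="nndsvda", random_state=0).fit_transform(X)

def pareto_front(scores):
    """Indices of rows not dominated by any other row (higher is better)."""
    front = []
    for i, row in enumerate(scores):
        dominated = np.any(np.all(scores >= row, axis=1) &
                           np.any(scores > row, axis=1))
        if not dominated:
            front.append(i)
    return front

print("Pareto-optimal universities:", pareto_front(W))
```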
9.
Sensors (Basel) ; 23(5)2023 Feb 23.
Article in English | MEDLINE | ID: mdl-36904704

ABSTRACT

This paper describes a framework for detecting welding errors using 3D scanner data. The proposed approach employs density-based clustering to compare point clouds and identify deviations. The discovered clusters are then classified according to standard welding fault classes. Six welding deviations defined in the ISO 5817:2014 standard were evaluated. All defects were represented through CAD models, and the method was able to detect five of these deviations. The results demonstrate that the errors can be effectively identified and grouped according to the location of the different points in the error clusters. However, the method cannot separate crack-related defects as a distinct cluster.
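A minimal sketch of the comparison-and-clustering step, with synthetic point clouds: scanned points are compared against a CAD-derived reference cloud, points deviating beyond a tolerance are kept, and DBSCAN groups them into defect clusters. The tolerance, DBSCAN parameters and injected defect are invented.

```python
# Hedged sketch: point-cloud deviation detection + DBSCAN defect grouping.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(4)
reference = rng.random((5000, 3))                        # CAD-derived cloud
defect = 0.05 * rng.random((50, 3)) + [0.5, 0.5, 1.05]   # blob off the surface
scan = np.vstack([reference + rng.normal(0, 1e-3, reference.shape), defect])

# Deviation = distance from each scanned point to its nearest reference point.
dist, _ = cKDTree(reference).query(scan)
deviating = scan[dist > 0.01]                            # tolerance threshold

labels = DBSCAN(eps=0.02, min_samples=5).fit_predict(deviating)
print("defect clusters found:", len(set(labels) - {-1}))
```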

10.
Heliyon ; 8(12): e12263, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36568657

ABSTRACT

Quality function deployment (QFD) is a widely acknowledged tool for translating customer requirements into quality product characteristics, based on which product development strategies and focus areas are identified. However, although the QFD method records the correlations and effects between development parameters, these are not directly incorporated into the importance ranking of development actions; the cross-relationships between development parameters and their impact on customer requirement satisfaction are therefore often neglected. The primary objective of this study is to make decision-making more reliable by improving QFD with methods that optimize the selection of development parameters even under capacity or cost constraints, directly incorporate the cross-relationships between development parameters, and support the visual identification of interactions. QFD is therefore approached with two techniques that have proved efficient in operations research. 1) QFD is formulated as a network flow problem with two objectives: maximizing the benefit of satisfying customer needs using linear optimization, or minimizing the total cost of actions while still meeting customer requirements using a minimum-cost flow assignment. 2) QFD is represented as a hypergraph, which allows an efficient representation of the interactions in the relationship and correlation matrices and the determination of essential factors based on centrality metrics. The applicability of the methods is demonstrated through an application study on the sustainable design of consumer electronic products, which highlights each improvement's contribution to different development strategies: linear optimization performed best in maximizing the satisfaction of customer requirements, the minimum-cost flow assignment minimized the total cost, while the hypergraph-based representation identified the indirect interactions between development parameters and customer requirements.
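A minimal sketch of the minimum-cost flow formulation on a toy instance: development actions supply effort, customer requirements demand coverage, and the solver routes effort at minimal total cost. Node names, capacities and costs are invented.

```python
# Hedged sketch: QFD action-to-requirement assignment as minimum-cost flow.
import networkx as nx

G = nx.DiGraph()
# Actions supply effort (negative demand), requirements consume it.
G.add_node("action_A", demand=-2)
G.add_node("action_B", demand=-1)
G.add_node("req_1", demand=2)
G.add_node("req_2", demand=1)
G.add_edge("action_A", "req_1", weight=3, capacity=2)  # cost per unit effort
G.add_edge("action_A", "req_2", weight=5, capacity=1)
G.add_edge("action_B", "req_1", weight=4, capacity=1)
G.add_edge("action_B", "req_2", weight=2, capacity=1)

flow = nx.min_cost_flow(G)
print(flow)                                 # which action serves which need
print("total cost:", nx.cost_of_flow(G, flow))
```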

11.
PLoS One ; 17(10): e0274779, 2022.
Article in English | MEDLINE | ID: mdl-36201501

ABSTRACT

The discovery of the human mobility patterns of cities provides invaluable information for decision-makers responsible for redesigning community spaces, traffic and public transportation systems, and for building more sustainable cities. The present article proposes a possibilistic fuzzy c-medoid clustering algorithm to study human mobility. The proposed medoid-based clustering approach groups the typical mobility patterns that lie within walking distance of the stations of the public transportation system. The departure times of the clustered trips are also taken into account to obtain recommendations for the scheduling of the designed public transportation lines. The effectiveness of the proposed methodology is revealed in an illustrative case study based on the analysis of the GPS data of taxicabs recorded at night over a one-year period in Budapest.


Subject(s)
Goals , Mobile Applications , Cities , Cluster Analysis , Humans , Transportation/methods
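A minimal sketch of the medoid-based mechanics, assuming a precomputed distance matrix: plain fuzzy c-medoids, whereas the paper's possibilistic variant additionally maintains typicality values. Data and parameters are illustrative.

```python
# Hedged sketch: fuzzy c-medoids on a pairwise distance matrix.
import numpy as np

def fuzzy_c_medoids(D, c=3, m=2.0, n_iter=50, seed=0):
    """D: (n, n) distance matrix; returns memberships and medoid indices."""
    n = D.shape[0]
    rng = np.random.default_rng(seed)
    medoids = rng.choice(n, size=c, replace=False)
    for _ in range(n_iter):
        d = np.maximum(D[:, medoids], 1e-12)           # (n, c) distances
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)),
                         axis=2)                        # fuzzy memberships
        # New medoid of each cluster: the point minimising weighted distance.
        new = np.array([np.argmin((u[:, j] ** m) @ D) for j in range(c)])
        if np.array_equal(new, medoids):
            break
        medoids = new
    return u, medoids

pts = np.random.default_rng(5).random((200, 2))         # e.g. trip endpoints
D = np.linalg.norm(pts[:, None] - pts[None], axis=2)
u, med = fuzzy_c_medoids(D)
print("medoid indices:", med)
```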
12.
J Environ Manage ; 323: 116165, 2022 Dec 01.
Article in English | MEDLINE | ID: mdl-36116263

ABSTRACT

Climate change can cause multiple potential health issues in urban areas, which are among the environments most susceptible to the presently increasing climate volatility. Urban greening strategies form an important part of the adaptation strategies that can ameliorate the negative impacts of climate change. We aimed to study the potential impacts of different kinds of greening against the adverse effects of climate change, including waterborne and vector-borne diseases, heat-related mortality, and surface ozone concentration, in a medium-sized Hungarian city. As greening strategies, large parks and pocket parks were considered, placed by our novel location-identifier algorithm for climate-risk minimization. A method based on publicly available data sources, including satellite images, climate scenarios and the urban macrostructure, has been developed to evaluate the patterns of health-related indicators in cities. The modelled future and current patterns of the indicators have been compared. The results can support the understanding of the possible future state of the studied indicators and the development of adequate greening strategies. Another outcome of the study is that it is not the type of health indicator but its climate sensitivity that determines the extent to which it responds to temperature rises and how effective greening strategies are in addressing the expected problem posed by the factor.


Subject(s)
Climate Change , Ozone , Cities , Health Impact Assessment , Hot Temperature , Ozone/analysis , Temperature , Urban Health
13.
Sensors (Basel) ; 22(11)2022 Jun 03.
Article in English | MEDLINE | ID: mdl-35684889

ABSTRACT

The present research presents a framework that supports the development and operation of machine-learning (ML) algorithms to develop, maintain and manage the whole lifecycle of software-sensor models for complex chemical processes. Our motivation is to take advantage of ML and edge computing and to offer innovative solutions to the chemical industry for difficult-to-measure laboratory variables. The purpose of software-sensor models is to continuously forecast the quality of products in order to achieve effective quality control, maintain stable production conditions in plants, and support efficient, environmentally friendly and harmless laboratory work. The literature review shows that quite a few ML models have been developed in recent years to support the quality assurance of different types of materials; however, the problems of the continuous operation, maintenance and version control of these models have not yet been solved. The method uses ML algorithms and takes advantage of cloud services in an enterprise environment. Industry 4.0 technologies such as the Internet of Things (IoT), edge computing, cloud computing, ML and artificial intelligence (AI) are its core techniques. The article outlines an information-system structure and the related methodology based on data from a quality-assurance laboratory. During the development, we encountered several challenges resulting from the continuous development of ML models and the tuning of their parameters. The article discusses the development, version control, validation, lifecycle and maintenance of ML models, and presents a case study. The developed framework can continuously monitor the performance of the models and increase the amount of data on which they are built. As a result, the most accurate, data-driven and up-to-date models are always available to quality-assurance engineers.


Subject(s)
Artificial Intelligence , Internet of Things , Cloud Computing , Machine Learning , Software
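A minimal sketch of the monitoring idea, under invented thresholds: track a software sensor's rolling error against laboratory reference values and raise a retraining flag when drift exceeds a limit. The class and its names are hypothetical, not the paper's framework API.

```python
# Hedged sketch: rolling-error monitor that flags a soft sensor for retraining.
from collections import deque
import numpy as np

class SoftSensorMonitor:
    def __init__(self, window=50, rmse_limit=1.5):
        self.errors = deque(maxlen=window)   # rolling squared errors
        self.rmse_limit = rmse_limit

    def update(self, predicted, lab_value):
        self.errors.append((predicted - lab_value) ** 2)
        rmse = float(np.sqrt(np.mean(self.errors)))
        return {"rmse": rmse, "retrain": rmse > self.rmse_limit}

monitor = SoftSensorMonitor()
for pred, lab in [(10.2, 10.0), (11.5, 10.1), (14.0, 10.3)]:
    print(monitor.update(pred, lab))   # retrain flag trips as drift grows
```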
14.
PLoS One ; 17(2): e0264277, 2022.
Article in English | MEDLINE | ID: mdl-35213620

ABSTRACT

The Promethee-GAIA method is a multicriteria decision-support technique that defines the aggregated ranks of multiple criteria and visualizes them based on Principal Component Analysis (PCA). In the case of numerous criteria, the PCA biplot-based visualization does not reveal how a criterion influences the decision problem. The central question is how the Promethee-GAIA-based decision-making process can be improved to obtain more interpretable results that reveal more of the characteristic inner relationships between the criteria. To improve the Promethee-GAIA method, we suggest three techniques that eliminate redundant criteria, clearly outline which criterion belongs to which factor, and explore the similarities between criteria. These methods are the following: A) principal factoring with rotation and communality analysis (P-PFA), B) the integration of sparse PCA into the Promethee II method (P-sPCA), and C) the Sum of Ranking Differences method (P-SRD). The suggested methods are presented on an I4.0+ dataset that measures the Industry 4.0 readiness of NUTS 2-classified regions. The proposed methods are useful tools for handling multicriteria ranking problems when the number of criteria is large.


Subject(s)
Decision Support Techniques , Models, Theoretical , Factor Analysis, Statistical , Industry
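A minimal sketch of the underlying Promethee II computation with the "usual" preference function, on an invented decision matrix; the paper's P-sPCA and P-SRD techniques operate on top of such flows.

```python
# Hedged sketch: Promethee II net flows with the usual preference function.
import numpy as np

def promethee_ii(X, weights):
    """X: (alternatives, criteria), higher is better; returns net flows."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]       # pairwise differences
    pref = (diff > 0).astype(float)            # usual preference function
    pi = (pref * weights).sum(axis=2)          # aggregated preference index
    phi_plus = pi.sum(axis=1) / (n - 1)        # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)       # entering flow
    return phi_plus - phi_minus                # net flow -> ranking

X = np.array([[7, 3, 9], [5, 8, 6], [9, 5, 4]])
w = np.array([0.5, 0.3, 0.2])
print(promethee_ii(X, w))   # rank alternatives by descending net flow
```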
15.
Sustain Cities Soc ; 76: 103422, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34729296

ABSTRACT

A suitable way to monitor the spread of SARS-CoV-2 is to identify potential sampling points in the wastewater collection system that can be used to track the distribution of COVID-19-affected clusters within a city. The applicability of the developed methodology is presented through the 72,837 population-equivalent wastewater collection system of the city of Nagykanizsa, Hungary, and the results of the analytical and epidemiological measurements of the wastewater samples. The wastewater sampling was conducted during the third wave of the COVID-19 epidemic. The overlap between the road system and the wastewater network was found to be high (82%). It was shown that the proposed methodological approach, using the tools of network science, confidently determines the zones of the wastewater collection system and provides ideal monitoring points that give the best sampling resolution in urban areas. The strength of the presented approach is that it estimates the network from publicly available information. The number of zones or sampling points can be chosen based on the relevant epidemiological intervention and mitigation strategies. The algorithm allows the continuous, effective monitoring of the population infected by SARS-CoV-2 in small-sized cities.
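A minimal sketch of the zoning idea, on a toy graph standing in for the publicly derived road/sewer network: partition the network into zones with modularity-based community detection and pick one high-centrality node per zone as a candidate sampling point. The paper's actual partitioning and point-selection rules may differ.

```python
# Hedged sketch: zone the network, then pick one central node per zone.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.random_geometric_graph(200, 0.12, seed=6)   # toy "road" network

zones = greedy_modularity_communities(G)
centrality = nx.betweenness_centrality(G)
sampling_points = [max(zone, key=centrality.get) for zone in zones]
print("candidate sampling nodes:", sampling_points)
```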

16.
Sensors (Basel) ; 23(1)2022 Dec 27.
Article in English | MEDLINE | ID: mdl-36616880

ABSTRACT

One of the main challenges of Industry 4.0 is how advanced sensors and sensing technologies can be applied, through the Internet of Things layers, to existing manufacturing. This is the so-called brownfield Industry 4.0, where machines and processes of different types and ages need to be digitalized. Smart retrofitting is the umbrella term for solutions that digitalize existing manufacturing machines, and the problem is critical for solutions that support human workers. The Operator 4.0 concept shows how workers on the shop floor can be supported efficiently. The key indicator is the readiness level of a company, and the main bottleneck is the technical knowledge of the employees. This study proposes an education framework and a related Operator 4.0 laboratory that prepare students for the development and application of Industry 5.0 technologies. The concept of intelligent space is proposed as the basis of the educational framework, which can solve the problem of monitoring the stochastic behaviour of operators in production processes. The components of the intelligent space are detailed through the layers of the IoT in the form of a case study conducted in the laboratory. The applicability of indoor positioning systems is described with the integration of machine-, operator- and environment-based sensor data to obtain real-time information from the shop floor. The digital twin of the laboratory is developed in a discrete-event simulator, which integrates the data from the shop floor and can control production based on the simulation results. The presented framework can be utilized to design education for the Industry 5.0 generation.


Subject(s)
Industry , Students , Humans , Commerce , Computer Simulation , Intelligence
17.
Sensors (Basel) ; 21(10)2021 May 12.
Article in English | MEDLINE | ID: mdl-34065951

ABSTRACT

The targeted shortening of sensor development requires short and convincing verification tests. The goal of developing novel verification methods is to avoid or reduce excessive testing and to identify tests which guarantee that the assumed failure will not occur in practice. In this paper, a method is presented that derives the test loads of such a verification. The method starts with the identification of the robustness-related requirements for the product, using precise descriptions of the use-case scenarios in which the product is assumed to work. Based on the logic of the Quality Function Deployment (QFD) method, a step-by-step procedure has been developed to translate the robustness requirements through the changes in design parameters, the phenomena causing them, and the physical quantities behind these phenomena, down to the test loads of the verification. The developed method is applied to the test plan of an automotive sensor. The method is general and can be used for any part of a vehicle, whether mechanical, electrical or mechatronic, such as sensors and actuators. Nonetheless, the method is applicable in a much broader area, even outside the automotive industry.
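A minimal sketch of the QFD-style translation chain, with invented relationship matrices: requirement weights are propagated through requirement-to-parameter, parameter-to-phenomenon and phenomenon-to-quantity matrices to prioritise candidate test loads.

```python
# Hedged sketch: propagating requirement weights through QFD-like matrices.
import numpy as np

req_weights = np.array([5, 3, 1])             # robustness requirement importance
R_param = np.array([[9, 3, 0],                # requirement x design parameter
                    [1, 9, 3],
                    [0, 1, 9]])
R_phen = np.array([[9, 1], [3, 9], [1, 3]])   # parameter x phenomenon
R_qty = np.array([[9, 0, 3], [1, 9, 3]])      # phenomenon x physical quantity

priority = req_weights @ R_param @ R_phen @ R_qty
for name, p in zip(["temperature", "vibration", "humidity"], priority):
    print(name, p)        # highest score -> most important test load
```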

18.
Data Brief ; 36: 106978, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33869697

ABSTRACT

This data article presents a dataset and a tool for the news-based monitoring of the Sustainable Development Goals defined by the United Nations. The dataset was created by structured queries of the GDELT database based on the categories of the World Bank taxonomy matched to the Sustainable Development Goals. The Google BigQuery SQL scripts and the results of the related network analysis are attached to the data to provide a toolset for the strategic management of sustainability issues. The article demonstrates the dataset on the 6th Sustainable Development Goal (Clean Water and Sanitation). The network formed from how countries appear in the same news stories can be used to explore potential international cooperation. The network formed from how topics of the World Bank taxonomy appear in the same news stories can be used to explore how the problems and solutions of sustainability issues are interlinked.
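A minimal sketch of the country co-occurrence network, with invented story data: countries mentioned in the same news item are linked, and edge weights count shared stories. The real dataset derives such lists from GDELT via the attached BigQuery scripts.

```python
# Hedged sketch: co-occurrence network of countries appearing in the same news.
from itertools import combinations
import networkx as nx

stories = [                       # country lists extracted per news item
    ["HU", "AT", "DE"],
    ["HU", "DE"],
    ["US", "CN"],
]

G = nx.Graph()
for countries in stories:
    for a, b in combinations(sorted(set(countries)), 2):
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

print(G.edges(data=True))   # e.g. HU-DE co-occurs in two stories
```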

19.
PLoS One ; 16(4): e0250247, 2021.
Article in English | MEDLINE | ID: mdl-33872343

ABSTRACT

This paper aims to identify the regional potential of Industry 4.0 (I4.0). Although the regional background of a company significantly determines how the concept of I4.0 can be introduced, the regional aspects of digital transformation are often neglected in analyses of I4.0 readiness. Based on an analysis of I4.0 readiness models, the external regional success factors of implementing I4.0 solutions are determined. An I4.0+ (regional Industry 4.0) readiness model, a specific indicator system, is developed to foster medium-term regional I4.0 readiness analysis and foresight planning. The indicator system is based on three types of data sources: (1) open governmental data; (2) alternative metrics such as the number of I4.0-related publications and patent applications; and (3) the number of news stories related to economic and industrial development. The indicators are aggregated to the statistical (NUTS 2) regions, and their relationships are analyzed using the Sum of Ranking Differences (SRD) and Promethee II methods. The developed I4.0+ readiness index correlates with regional economic, innovation and competitiveness indexes, which indicates the importance of boosting regional I4.0 readiness.


Subject(s)
Automation/economics , Economic Development/trends , Industry/trends , Automation/methods , Benchmarking , Government , Industry/statistics & numerical data
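A minimal sketch of the SRD step on random data: each indicator's ranking of the regions is compared with a reference ranking (here the row average, a common SRD choice) by summing absolute rank differences.

```python
# Hedged sketch: Sum of Ranking Differences against a row-average reference.
import numpy as np
from scipy.stats import rankdata

def srd(X):
    """X: (regions, indicators); lower SRD = closer to the consensus."""
    ref_rank = rankdata(X.mean(axis=1))
    return np.array([np.abs(rankdata(X[:, j]) - ref_rank).sum()
                     for j in range(X.shape[1])])

X = np.random.default_rng(7).random((20, 5))   # 20 regions, 5 indicators
print(srd(X))   # indicators with the lowest SRD track the consensus best
```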
20.
Heliyon ; 7(2): e06174, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33598579

ABSTRACT

This study presents a novel approach to the analysis of the Sustainable Development Goals (SDGs) based solely on the appearance of news. Our purpose is to provide a monitoring tool that enables world news to be detected in an SDG-oriented manner, with multilingual as well as wide geographic coverage. The association of the goals with news is based on the World Bank Group Topical Taxonomy, from which the selected search words approximate the 17 development goals. News is extracted from the GDELT Project (Global Database of Events, Language and Tone), which gathers both printed and online news from around the world; 60,851,572 relevant news stories were identified in 2019. The intertwining of world news with the SDGs, as well as the connections between countries, is interpreted, highlighting that even in the most SDG-sensitive countries only 2.5% of the news can be attributed to the goals. Most of the news about sustainability appears in Africa and in East and Southeast Asia, and the most negative news tone is typically observed in Africa. In the case of climate change (SDG 13), the United States plays a key role in both the share of news and its negative tone. Using the tools of network science, it can be verified that the SDGs can be characterized on the basis of world news. This news-centred network analysis of the SDGs identifies global partnerships as well as national stages of implementation towards a sustainable socio-environmental ecosystem. In the field of sustainability, it is vital to shape people's attitudes and environmental awareness, which strategic plans cannot address but which can be measured well through the news.
