Results 1 - 20 of 6,029
1.
Neural Netw ; 181: 106766, 2024 Sep 26.
Article in English | MEDLINE | ID: mdl-39357267

ABSTRACT

Bio-inspired Autonomous Underwater Vehicles with soft bodies provide significant performance benefits over conventional propeller-driven vehicles; however, it is difficult to control these vehicles due to their soft, underactuated bodies. This study investigates the application of Physical Reservoir Computing (PRC) in the swimmer's flexible body to perform state estimation. This PRC-informed state estimation could potentially be used in vehicle control. PRC is a type of recurrent neural network that leverages the nonlinear dynamics of a physical system to predict a nonlinear spatiotemporal input-output relationship. By embodying the neural network in the physical structure, PRC can process the response to an environmental input with high computational efficiency. This study uses a soft bio-inspired propulsor embodied as a physical reservoir. We evaluate its ability to perform different state estimation tasks, including predicting hydrodynamic forces and benchmark computational tasks, in response to the forcing applied to the artificial muscles during actuation. The propulsor's nonlinear fluid-structural dynamics act as the physical reservoir, and the kinematic feedback serves as the reservoir readouts. We show that the bio-inspired underwater propulsor can predict the hydrodynamic thrust and benchmark tasks with high accuracy under specific input frequencies. By analyzing the frequency spectrum of the input, readouts, and target signals, we demonstrate that the system's dynamic response determines the frequency content relevant to the task being predicted. The propulsor's ability to process information stems from its nonlinearity, which is responsible for transforming the input signal into a broader spectrum of frequency content at the readouts. This broad band of frequency content is necessary to recreate the target signal within the PRC algorithm, thereby improving prediction performance.
The spectral analysis provides a unique perspective to analyze the nonlinear dynamics of a physical reservoir and serves as a valuable tool for examining other types of vibratory systems for PRC. This work serves as a first step towards embodying computation into soft bio-inspired swimmers.
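The training scheme the abstract describes, a fixed nonlinear dynamical system feeding a trained linear readout, can be sketched in software. In the toy example below, a small echo-state-style network stands in for the propulsor's fluid-structure dynamics, and the target is an arbitrary nonlinear function of the input; the network size, learning rate, and task are illustrative assumptions, not the authors' setup.

```python
import math
import random

random.seed(0)

def reservoir_states(inputs, n_nodes=20, leak=0.3):
    """Drive a small leaky-tanh reservoir and return its state history."""
    w_in = [random.uniform(-1, 1) for _ in range(n_nodes)]
    w_rec = [[random.uniform(-0.1, 0.1) for _ in range(n_nodes)]
             for _ in range(n_nodes)]
    x = [0.0] * n_nodes
    history = []
    for u in inputs:
        pre = [w_in[i] * u + sum(w_rec[i][j] * x[j] for j in range(n_nodes))
               for i in range(n_nodes)]
        x = [(1 - leak) * x[i] + leak * math.tanh(pre[i])
             for i in range(n_nodes)]
        history.append(list(x))
    return history

# Task: predict a nonlinear function of the input (a stand-in for thrust).
inputs = [math.sin(0.2 * t) for t in range(300)]
target = [u ** 3 for u in inputs]

states = reservoir_states(inputs)

# Train only the linear readout, by stochastic gradient descent with a
# small ridge-like penalty; the "reservoir" itself is never trained.
n = len(states[0])
w_out = [0.0] * n
for _ in range(200):
    for x, y in zip(states, target):
        y_hat = sum(w_out[i] * x[i] for i in range(n))
        err = y_hat - y
        for i in range(n):
            w_out[i] -= 0.01 * (err * x[i] + 1e-4 * w_out[i])

mse = sum((sum(w_out[i] * x[i] for i in range(n)) - y) ** 2
          for x, y in zip(states, target)) / len(target)
print(f"readout MSE: {mse:.4f}")
```

Only the readout weights are adjusted; the reservoir stays fixed, which is what makes PRC attractive when the reservoir is a physical body rather than simulated dynamics.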

2.
Adv Mater ; : e2410432, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39350463

ABSTRACT

Precise event detection within time-series data is increasingly critical, particularly in noisy environments. Reservoir computing, a robust computing method widely utilized with memristive devices, is efficient in processing temporal signals. However, it typically lacks intrinsic thresholding mechanisms essential for precise event detection. This study introduces a new approach by integrating two Pt/HfO2/TiN (PHT) memristors and one Ni/HfO2/n-Si (NHS) metal-oxide-semiconductor capacitor (2M1MOS) to implement a tunable thresholding function. The current-voltage nonlinearity of memristors combined with the capacitance-voltage nonlinearity of the capacitor forms the basis of the 2M1MOS kernel system. The proposed kernel hardware effectively records feature-specified information of the input signal onto the memristors through capacitive thresholding. In electrocardiogram analysis, the memristive response exhibited a more than ten-fold difference between arrhythmia and normal beats. In isolated spoken digit classification, the kernel achieved an error rate of only 0.7% by tuning thresholds for various time-specific conditions. The kernel is also applied to biometric authentication by extracting personal features using various threshold times, presenting more complex and multifaceted uses of heartbeats and voice data as bio-indicators. These demonstrations highlight the potential of thresholding computing in a memristive framework with heterogeneous integration.

3.
Cytometry A ; 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39351999

ABSTRACT

Imaging flow cytometry (IFC) provides single-cell imaging data at a high acquisition rate. It is increasingly used in image-based profiling experiments consisting of hundreds of thousands of multi-channel images of cells. Currently available software solutions for processing microscopy data can provide good results in downstream analysis, but they are limited in efficiency and scalability, and often ill-adapted to IFC data. In this work, we propose Scalable Cytometry Image Processing (SCIP), a Python software package that efficiently processes images from IFC and standard microscopy datasets. We also propose a file format for efficiently storing IFC data. We showcase our contributions on two large-scale microscopy datasets and one IFC dataset, all of which are publicly available. Our results show that SCIP can extract the same kind of information as other tools in a much shorter time and in a more scalable manner.

4.
JMIR Form Res ; 8: e58241, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39352736

ABSTRACT

BACKGROUND: Online mental health communities (OMHCs) are an effective and accessible channel to give and receive social support for individuals with mental and emotional issues. However, a key challenge on these platforms is finding suitable partners to interact with given that mechanisms to match users are currently underdeveloped or highly naive. OBJECTIVE: In this study, we collaborated with one of the world's largest OMHCs; our contribution is to show the application of agent-based modeling for the design of online community matching algorithms. We developed an agent-based simulation framework and showcased how it can uncover trade-offs in different matching algorithms between people seeking support and volunteer counselors. METHODS: We used a comprehensive data set spanning January 2020 to April 2022 to create a simulation framework based on agent-based modeling that replicates the current matching mechanisms of our research site. After validating the accuracy of this simulated replication, we used this simulation framework as a "sandbox" to test different matching algorithms based on the deferred acceptance algorithm. We compared trade-offs among these different matching algorithms based on various metrics of interest, such as chat ratings and matching success rates. RESULTS: Our study suggests that various tensions emerge through different algorithmic choices for these communities. For example, our simulation uncovered that increased waiting time for support seekers was an inherent consequence on these sites when intelligent matching was used to find more suitable matches. Our simulation also verified some intuitive effects, such as that the greatest number of support seeker-counselor matches occurred using a "first come, first served" protocol, whereas relatively fewer matches occurred using a "last come, first served" protocol. We also discuss practical findings regarding matching for vulnerable versus overall populations. 
Results by demographic group revealed disparities: underage and gender minority groups had lower average chat ratings and higher blocking rates on the site compared to their majority counterparts, indicating the potential benefits of algorithmically matching them. We found that some protocols, such as a "filter"-based approach that matched vulnerable support seekers only with a counselor of their same demographic, led to improvements for these groups but resulted in lower satisfaction (-12%) among the overall population. However, this trade-off between minority and majority groups was not observed when using "topic" as a matching criterion. Topic-based matching actually outperformed the filter-based protocol among underage people and led to significant improvements over the status quo among all minority and majority groups; specifically, a 6% average chat rating improvement and a decrease in blocking incidents from 5.86% to 4.26%. CONCLUSIONS: Agent-based modeling can reveal significant design considerations in the OMHC context, including trade-offs in various outcome metrics and the potential benefits of algorithmic matching for marginalized communities.
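The deferred acceptance (Gale-Shapley) algorithm that the tested matching protocols build on can be sketched as follows. The seeker and counselor names, the preference lists, and the assumption of complete preferences on both sides are illustrative, not taken from the research site.

```python
from collections import deque

def deferred_acceptance(seeker_prefs, counselor_prefs):
    """Stable matching of support seekers to counselors; seekers propose.

    Assumes complete preference lists and equal numbers on both sides.
    """
    # Precompute each counselor's ranking of seekers for O(1) comparisons.
    rank = {c: {s: r for r, s in enumerate(prefs)}
            for c, prefs in counselor_prefs.items()}
    free = deque(seeker_prefs)                  # seekers not yet matched
    next_choice = {s: 0 for s in seeker_prefs}  # index into each seeker's list
    match = {}                                  # counselor -> seeker
    while free:
        s = free.popleft()
        c = seeker_prefs[s][next_choice[s]]     # best counselor not yet tried
        next_choice[s] += 1
        if c not in match:
            match[c] = s                        # counselor was unmatched
        elif rank[c][s] < rank[c][match[c]]:    # counselor prefers newcomer
            free.append(match[c])               # displaced seeker re-queues
            match[c] = s
        else:
            free.append(s)                      # rejected; tries next choice
    return {s: c for c, s in match.items()}

seeker_prefs = {"s1": ["c1", "c2"], "s2": ["c1", "c2"]}
counselor_prefs = {"c1": ["s2", "s1"], "c2": ["s1", "s2"]}
print(deferred_acceptance(seeker_prefs, counselor_prefs))
# Both seekers want c1, but c1 prefers s2, so s1 ends up with c2.
```

The waiting-time trade-off the study reports is visible even in this toy: a displaced seeker re-enters the queue and proposes again, so better-quality matches come at the cost of extra rounds.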


Subject(s)
Algorithms; Humans; Systems Analysis; Mental Health Services; Social Support; Mental Health; Internet
5.
BMC Bioinformatics ; 25(1): 319, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39354372

ABSTRACT

BACKGROUND: Single-cell RNA sequencing (scRNAseq) offers powerful insights, but the surge in sample sizes demands more computational power than local workstations can provide. Consequently, high-performance computing (HPC) systems have become imperative. Existing web apps designed to analyze scRNAseq data lack scalability and integration capabilities, while analysis packages demand coding expertise, hindering accessibility. RESULTS: In response, we introduce scRNAbox, an innovative scRNAseq analysis pipeline meticulously crafted for HPC systems. This end-to-end solution, executed via the SLURM workload manager, efficiently processes raw data from standard and Hashtag samples. It incorporates quality control filtering, sample integration, clustering, cluster annotation tools, and facilitates cell type-specific differential gene expression analysis between two groups. We demonstrate the application of scRNAbox by analyzing two publicly available datasets. CONCLUSION: ScRNAbox is a comprehensive end-to-end pipeline designed to streamline the processing and analysis of scRNAseq data. By responding to the pressing demand for a user-friendly HPC solution, scRNAbox bridges the gap between the growing computational demands of scRNAseq analysis and the coding expertise required to meet them.


Subject(s)
Sequence Analysis, RNA; Single-Cell Analysis; Software; Single-Cell Analysis/methods; Sequence Analysis, RNA/methods; Humans; Computational Biology/methods
6.
BMC Bioinformatics ; 25(1): 321, 2024 Oct 03.
Article in English | MEDLINE | ID: mdl-39358680

ABSTRACT

BACKGROUND: Several computational and mathematical models of protein synthesis have been explored to accomplish the quantitative analysis of protein synthesis components and polysome structure. The effects of the gene sequence (coding and non-coding regions) on protein synthesis, of mutations in the gene sequence, and of a functional model of the ribosome need to be explored to further investigate the relationships among protein synthesis components. Ribosomal computing is implemented by imitating the functional properties of protein synthesis. RESULTS: In the proposed work, a general framework of ribosomal computing is demonstrated by developing a computational model that presents the relationship between the biological details of protein synthesis and computing principles. Here, mathematical abstractions are chosen carefully, without probing into the intricate chemical details of the micro-operations of protein synthesis, for ease of understanding. This model demonstrates the cause and effect of ribosome stalling during protein synthesis and the relationship between functional proteins and gene sequences. Moreover, it also reveals the computing nature of ribosome molecules and other protein synthesis components. The effect of gene mutation on protein synthesis is also explored in this model. CONCLUSION: The computational model for ribosomal computing is implemented in this work. The proposed model demonstrates the relationships between gene sequences and protein synthesis components. This model also helps to implement a simulation environment (a simulator) for generating protein chains from gene sequences and can spot problems during protein synthesis. Thus, this simulator can identify diseases that can arise from protein synthesis problems and suggest precautions for them.
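To make the simulator idea concrete, here is a minimal sketch of walking a coding sequence codon by codon, emitting amino acids, and halting on either a stop codon or an unrecognized triplet as a crude model of stalling. The codon table is a small excerpt of the standard genetic code, and the stalling model is a simplifying assumption, not the paper's mechanism.

```python
# Excerpt of the standard genetic code (mRNA codons -> amino acids).
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "UUC": "Phe", "GGU": "Gly",
    "GCU": "Ala", "UGG": "Trp", "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(mrna):
    """Return the peptide chain and a status flag for the run."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        codon = mrna[i:i + 3]
        aa = CODON_TABLE.get(codon)
        if aa is None:
            # Crude model of ribosome stalling on an unusable codon,
            # e.g. one produced by a mutation.
            return peptide, f"stalled at codon {codon}"
        if aa == "STOP":
            return peptide, "released"
        peptide.append(aa)
    return peptide, "ran off transcript"   # no stop codon encountered

print(translate("AUGUUUGGUUAA"))   # normal termination at UAA
print(translate("AUGUUUXXXUAA"))   # corrupted codon triggers a stall
```

Comparing the two runs shows the cause-and-effect relationship the model aims at: a single altered codon truncates the protein chain, which is the kind of defect such a simulator could flag.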


Subject(s)
Computational Biology; Protein Biosynthesis; Ribosomes; Ribosomes/metabolism; Computational Biology/methods; Computer Simulation; Mutation
7.
ACS Nano ; 2024 Oct 03.
Article in English | MEDLINE | ID: mdl-39361524

ABSTRACT

As growth in global demand for computing power continues to outpace ongoing improvements in transistor-based hardware, novel computing solutions are required. One promising approach employs stochastic nanoscale devices to accelerate probabilistic computing algorithms. Percolating Networks of Nanoparticles (PNNs) exhibit stochastic spiking, which is of particular interest as it meets the criteria for criticality, a property associated with a range of computational advantages. Here, we show several ways in which spiking PNNs can be used as the core stochastic components of coupled networks that allow successful factorization of integers up to 945. We demonstrate asynchronous operation and show that a single device is sufficient to solve all factorization tasks and to generate multiple solutions simultaneously.
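As a loose software analogy for the probabilistic approach, the toy sketch below repeatedly samples candidate factor pairs from a pseudo-random source until one divides the target. In the hardware, the randomness comes from the stochastic spiking of the percolating network and the network coupling steers the search; neither is modeled here, so the Python `random` module simply stands in for the stochastic device.

```python
import random

def stochastic_factor(n, n_bits=6, seed=1, max_trials=1_000_000):
    """Sample random factor pairs until one satisfies p * q == n.

    A toy stand-in for probabilistic factorization; n_bits bounds the
    candidate factors (6 bits suffices for 945 = 27 * 35).
    """
    rng = random.Random(seed)
    for _ in range(max_trials):
        p = rng.getrandbits(n_bits)
        q = rng.getrandbits(n_bits)
        if 1 < p <= q and p * q == n:
            return p, q
    return None   # give up (vanishingly unlikely for feasible n)

print(stochastic_factor(945))   # one valid nontrivial factor pair
```

Unlike a deterministic sieve, repeated runs with different seeds naturally surface different valid factor pairs, which loosely mirrors the device's ability to generate multiple solutions.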

8.
Cureus ; 16(9): e69608, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39308843

ABSTRACT

The introduction of Apple Vision Pro (AVP) marks a significant milestone in the intersection of technology and healthcare, offering unique capabilities in mixed reality, which Apple terms "spatial computing." This narrative review aims to explore the various applications of AVP in medical technology, emphasizing its impact on patient care, clinical practices, medical education, and future directions. The review synthesizes findings from multiple studies and articles published between January 2023 and May 2024, highlighting AVP's potential to enhance visualization in diagnostic imaging and surgical planning, assist visually impaired patients, and revolutionize medical education through immersive learning environments. Despite its promise, challenges remain in integrating AVP into existing healthcare systems and understanding its long-term impact on patient outcomes. As research continues, AVP is poised to play a pivotal role in the future of medicine, offering a transformative tool for healthcare professionals.

9.
Front Neurosci ; 18: 1450640, 2024.
Article in English | MEDLINE | ID: mdl-39308944

ABSTRACT

This paper addresses the challenges posed by frequent memory access during simulations of large-scale spiking neural networks involving synaptic plasticity. We focus on the memory accesses performed during a common synaptic plasticity rule since this can be a significant factor limiting the efficiency of the simulations. We propose neuron models that are represented by only three state variables, which are engineered to enforce the appropriate neuronal dynamics. Additionally, memory retrieval is executed solely by fetching postsynaptic variables, promoting a contiguous memory storage and leveraging the capabilities of burst mode operations to reduce the overhead associated with each access. Different plasticity rules could be implemented despite the adopted simplifications, each leading to a distinct synaptic weight distribution (i.e., unimodal and bimodal). Moreover, our method requires fewer average memory accesses compared to a naive approach. We argue that the strategy described can speed up memory transactions and reduce latencies while maintaining a small memory footprint.
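To illustrate how compact a three-state-variable neuron can be, the sketch below uses a leaky integrate-and-fire neuron whose entire per-neuron state is membrane potential, adaptation, and a synaptic trace. The specific variables, time constants, and update rules are illustrative assumptions, since the abstract does not spell out the model; the point is that a small, fixed-size state tuple is all that must be fetched per memory access.

```python
def step(state, input_current, dt=1.0):
    """Advance one neuron by one time step; returns (new_state, spiked).

    The whole neuron state is the 3-tuple (v, a, s):
      v - membrane potential, a - adaptation, s - synaptic trace.
    Time constants below are arbitrary illustrative choices.
    """
    v, a, s = state
    s += -s * dt / 5.0 + input_current   # synaptic trace, tau_s = 5
    v += dt * (-v / 20.0 + s - a)        # leaky integration, tau_m = 20
    a += -a * dt / 100.0                 # slow adaptation decay, tau_a = 100
    spiked = v >= 1.0
    if spiked:
        v = 0.0                          # reset on spike
        a += 0.1                         # spike-triggered adaptation jump
    return (v, a, s), spiked

state, spikes = (0.0, 0.0, 0.0), 0
for t in range(200):
    state, fired = step(state, 0.05)     # constant drive
    spikes += fired
print(f"spikes in 200 steps: {spikes}")
```

Because the state is a fixed 3-tuple, a simulator can store all neurons contiguously and fetch them with burst reads, which is the memory-access pattern the paper targets.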

10.
Heliyon ; 10(18): e37490, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39309787

ABSTRACT

The current society is becoming increasingly interconnected and hyper-connected. Communication networks are advancing, as are logistics networks and even networks for the transportation and distribution of natural resources. One of the key benefits of the evolution of these networks is bringing consumers closer to the source of a resource or service. However, this is not a straightforward task, particularly since networks near final users are usually shaped by heterogeneous nodes, sometimes in very dense scenarios, which may demand or offer a resource at any given moment. In this paper, we present DEN2NE, a novel algorithm designed for the automatic distribution and reallocation of resources in distributed environments. The algorithm has been implemented with six different criteria in order to adapt it to the specific use case under consideration. The results obtained from DEN2NE are promising, owing to its adaptability and its average execution time, which scales linearly with topology size.
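To make the distribution idea concrete, here is a hedged sketch of one plausible criterion: assign each demanding node to the nearest offering node by hop count while respecting capacity. The graph, capacities, and the "fewest hops" rule are illustrative assumptions, not DEN2NE's actual six criteria.

```python
from collections import deque

def bfs_dist(graph, src):
    """Hop-count distances from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def distribute(graph, offers, demands):
    """offers: node -> available units; demands: nodes needing one unit each.

    Greedily assigns each demand to the closest source with spare capacity.
    Returns a mapping of demand node -> chosen source.
    """
    dists = {s: bfs_dist(graph, s) for s in offers}
    assignment = {}
    for d in demands:
        candidates = [s for s in offers if offers[s] > 0 and d in dists[s]]
        if not candidates:
            continue                     # demand cannot be served
        best = min(candidates, key=lambda s: dists[s][d])
        offers[best] -= 1                # consume one unit of capacity
        assignment[d] = best
    return assignment

# Line topology a - b - c - d; sources at the edges, demands in the middle.
graph = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(distribute(graph, {"a": 1, "d": 2}, ["b", "c"]))
# b is served by the nearer source a; once a is exhausted, c falls to d.
```

Swapping the hop-count criterion for, say, residual capacity or link cost only changes the `min(...)` key, which is one way an algorithm of this kind can support multiple selection criteria.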

11.
J Intensive Med ; 4(4): 468-477, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39310065

ABSTRACT

This study investigates the use of computational frameworks for sepsis. We consider two dimensions for investigation: early diagnosis of sepsis (EDS) and the mortality prediction rate for sepsis patients (MPS). We concentrate on the clinical parameters on which sepsis diagnosis and prognosis are currently based, including customized treatment plans based on a patient's historical data. We identify the most notable literature that uses computational models to address EDS and MPS based on those clinical parameters. In addition to the review of the computational models built upon the clinical parameters, we also provide details regarding popular publicly available data sources. We provide brief reviews of each model in terms of prior art and present an analysis of their results, as claimed by the respective authors. With respect to the use of machine learning models, we provide avenues for model analysis in terms of model selection, model validation, model interpretation, and model comparison. We further present the challenges and limitations of the use of computational models, providing future research directions. This study intends to serve as a benchmark for first-hand impressions of the use of computational models for EDS and MPS of sepsis, along with details regarding which models have been the most promising to date. We have provided details regarding all the ML models that have been used to date for EDS and MPS of sepsis.

12.
Adv Mater ; : e2409406, 2024 Sep 24.
Article in English | MEDLINE | ID: mdl-39318076

ABSTRACT

High-performance semiconductor devices capable of multiple functions are pivotal in meeting the challenges of miniaturization and integration in advanced technologies. Despite the inherent difficulties of incorporating dual functionality within a single device, a high-performance, dual-mode device is reported. This device integrates an ultra-thin Al2O3 passivation layer with a PbS/Si hybrid heterojunction, which simultaneously enables optoelectronic detection and neuromorphic operation. In mode 1, the device efficiently separates photo-generated electron-hole pairs, exhibiting an ultra-wide spectral response from ultraviolet (265 nm) to near-infrared (1650 nm) wavelengths. It also reproduces high-quality images of 256 × 256 pixels, achieving a Q-value as low as 0.00437 µW cm⁻² at a light intensity of 8.58 µW cm⁻². Meanwhile, in mode 2, the as-assembled device, with its typical persistent photoconductivity (PPC) behavior, can act as a neuromorphic device and achieves 96.5% accuracy in classifying standard digits, underscoring its efficacy in temporal information processing. It is believed that the present dual-function devices can advance the multifunctionality and miniaturization of chips for intelligent applications.

13.
ACS Nano ; 18(39): 27009-27015, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39288273

ABSTRACT

Magnetic tunneling junctions (MTJs) lie in the core of magnetic random access memory, holding promise in integrating memory and computing to reduce hardware complexity, transition latency, and power consumption. However, traditional MTJs are insensitive to light, limiting their functionality in in-memory sensing, a crucial component for machine vision systems in artificial intelligence applications. Herein, the convergence of magnetic memory with optical sensing capabilities is achieved in the all-two-dimensional (2D) magnetic junction Fe3GaTe2/WSe2/Fe3GaTe2, which combines 2D magnetism and optoelectronic properties. The clean intrinsic band gap and prominent photoresponse of interlayer WSe2 endow the tunneling barrier with optical tunability. The on-off states of junctions and the magnetoresistance can be flexibly controlled by the intensity of the optical signal at room temperature. Based on the optical-tunable magnetoresistance in all-2D magnetic junctions, a machine vision system with the architecture of in-memory sensing and computing is constructed, which possesses high performance in image recognition. Our work exhibits the advantages of 2D magneto-electronic devices and extends the application scenarios of magnetic memory devices in artificial intelligence.

14.
Sci Rep ; 14(1): 22249, 2024 Sep 27.
Article in English | MEDLINE | ID: mdl-39333218

ABSTRACT

The rotary motor plays a pivotal role in various motion execution mechanisms. However, an inherent issue arises during the initial installation of the encoder grating, namely, eccentricity between the centers of the encoder grating and the motor shaft. This eccentricity substantially affects the accuracy of motor angle measurements. To address this challenge, we propose a precision encoder grating mounting system that automates the mounting process. The proposed system mainly comprises a near-sensor detector and a push rod. Using a near-sensor approach, the detector captures images of the rotating encoder grating, and the eccentricity is computed in real time. This approach substantially reduces time delays in image data transmission, thereby enhancing the speed and accuracy of the eccentricity calculation. The major contribution of this article is a method for real-time eccentricity calculation that leverages an edge processor within the detector and an edge-vision baseline detection algorithm. This method enables real-time determination of the eccentricity and eccentricity angle of the encoder grating. Leveraging the obtained eccentricity and eccentricity angle data, the position of the encoder grating can be automatically adjusted by the push rod. In the experimental results, the detector obtains the eccentricity and eccentricity angle of the encoder grating within 2.8 s. The system efficiently and precisely completes an encoder grating mounting task in an average of 25.1 s, and the average eccentricity after mounting is 3.8 µm.
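The geometric idea behind estimating eccentricity from a rotating grating can be sketched as follows: as an eccentric grating rotates, the radial position of its edge seen by a fixed detector varies approximately sinusoidally with rotation angle, r(θ) ≈ R + e·cos(θ − φ), where e is the eccentricity and φ the eccentricity angle, so a first-order Fourier fit over one revolution recovers both. The fitting routine and the synthetic numbers below are illustrative, not the authors' edge-vision baseline algorithm.

```python
import math

def fit_eccentricity(angles, radii):
    """Recover (e, phi) from r(theta) = R + e*cos(theta - phi) samples.

    Uses the first-order Fourier coefficients over one full revolution.
    """
    n = len(radii)
    mean_r = sum(radii) / n   # estimate of the nominal radius R
    a = 2 / n * sum((r - mean_r) * math.cos(t) for t, r in zip(angles, radii))
    b = 2 / n * sum((r - mean_r) * math.sin(t) for t, r in zip(angles, radii))
    return math.hypot(a, b), math.atan2(b, a)   # (e, phi)

# Synthetic revolution: R = 1000 um, e = 3.8 um, phi = 0.7 rad (assumed values).
angles = [2 * math.pi * k / 360 for k in range(360)]
radii = [1000 + 3.8 * math.cos(t - 0.7) for t in angles]

e, phi = fit_eccentricity(angles, radii)
print(f"e = {e:.2f} um, phi = {phi:.2f} rad")
```

Given e and φ, a correction stage like the paper's push rod knows both how far and in which direction to nudge the grating.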

15.
Front Optoelectron ; 17(1): 33, 2024 Sep 29.
Article in English | MEDLINE | ID: mdl-39342550

ABSTRACT

In recent years, quantum computing has made significant strides, particularly in light-based technology. The introduction of quantum photonic chips has ushered in an era marked by scalability, stability, and cost-effectiveness, paving the way for innovative possibilities within compact footprints. This article provides a comprehensive exploration of photonic quantum computing, covering key aspects such as encoding information in photons, the merits of photonic qubits, and essential photonic device components including light squeezers, quantum light sources, interferometers, photodetectors, and waveguides. The article also examines photonic quantum communication and internet, and its implications for secure systems, detailing implementations such as quantum key distribution and long-distance communication. Emerging trends in quantum communication and essential reconfigurable elements for advancing photonic quantum internet are discussed. The review further navigates the path towards establishing scalable and fault-tolerant photonic quantum computers, highlighting quantum computational advantages achieved using photons. Additionally, the discussion extends to programmable photonic circuits, integrated photonics and transformative applications. Lastly, the review addresses prospects, implications, and challenges in photonic quantum computing, offering valuable insights into current advancements and promising future directions in this technology.

16.
Sci Rep ; 14(1): 22622, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39349932

ABSTRACT

With the proliferation of services and the vast amount of data produced by the Internet, numerous services with comparable functionalities but varying Quality of Service (QoS) attributes are potential candidates for meeting user needs. Consequently, selecting the most suitable services has become increasingly challenging. To address this issue, multiple services are synthesized through a composition process to create more sophisticated services. In recent years, there has been growing interest in QoS uncertainty, given its potential impact on determining an optimal composite service, where each service is characterized by multiple QoS properties (e.g., response time and cost) that are frequently subject to change, primarily due to environmental factors. Here, we introduce a novel approach based on the Multi-Agent Whale Optimization Algorithm (MA-WOA) for the web service composition problem. Our proposed algorithm uses a multi-agent system for the representation and control of candidate services, applying MA-WOA to identify the optimal composition that meets the user's requirements. It accounts for multiple quality factors and employs a weighted aggregation function to combine them into a single fitness function. The efficiency of the proposed method is evaluated using real and artificial web service composition datasets (comprising a total of 52,000 web services), with results indicating its superiority over other state-of-the-art methods in terms of composition quality and computational effectiveness. The proposed strategy therefore presents a feasible and effective solution to the web service composition challenge, representing a significant advancement in the field of service-oriented computing.
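The weighted-aggregation idea for scoring a candidate composition can be sketched as follows: each QoS attribute is aggregated across the composed services, normalized so that higher is better, and combined with user-supplied weights into one fitness value. The attribute names, weights, bounds, and aggregation rules below are illustrative assumptions, not the paper's exact setup.

```python
def normalize(value, lo, hi, cost_like):
    """Map a raw QoS value into [0, 1]; cost-like attributes are inverted."""
    if hi == lo:
        return 1.0
    x = (value - lo) / (hi - lo)
    return 1.0 - x if cost_like else x

def fitness(composition, weights, bounds):
    """Weighted-aggregation fitness of a sequential composition.

    Response times and costs add up across services; availability
    multiplies (all services must be up).
    """
    total = {
        "response_time": sum(s["response_time"] for s in composition),
        "cost": sum(s["cost"] for s in composition),
        "availability": 1.0,
    }
    for s in composition:
        total["availability"] *= s["availability"]
    score = 0.0
    for attr, w in weights.items():
        lo, hi = bounds[attr]
        cost_like = attr in ("response_time", "cost")
        score += w * normalize(total[attr], lo, hi, cost_like)
    return score

services = [
    {"response_time": 120, "cost": 0.5, "availability": 0.99},
    {"response_time": 80, "cost": 0.2, "availability": 0.97},
]
weights = {"response_time": 0.4, "cost": 0.3, "availability": 0.3}
bounds = {"response_time": (0, 1000), "cost": (0, 5), "availability": (0, 1)}
print(f"fitness: {fitness(services, weights, bounds):.3f}")
```

A metaheuristic such as MA-WOA then searches the space of candidate service combinations for the composition maximizing this scalar score; handling QoS uncertainty would additionally require treating each attribute as a distribution rather than the point values used here.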

17.
Sci Rep ; 14(1): 22703, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39349958

ABSTRACT

Developing collaborative research platforms for quantum bit control is crucial for driving innovation in the field, as they enable the exchange of ideas, data, and implementations to achieve more impactful outcomes. Furthermore, considering the high costs associated with quantum experimental setups, collaborative environments are vital for maximizing resource utilization. However, the lack of dedicated data management platforms presents a significant obstacle to progress, highlighting the necessity of assistive tools tailored for this purpose. Current qubit control systems cannot handle the complicated management of extensive calibration data and do not effectively support visualizing intricate quantum experiment outcomes. In this paper, we introduce Qubit Control Storage and Visualization (QubiCSV), a platform specifically designed to meet the demands of quantum computing research, focusing on the storage and analysis of calibration and characterization data in qubit control systems. As an open-source tool, QubiCSV facilitates efficient data management for quantum computing, providing data versioning capabilities and allowing researchers and programmers to interact with qubits in real time. Insightful visualizations are developed to interpret complex quantum experiments and optimize qubit performance. QubiCSV not only streamlines the handling of qubit control system data but also improves the user experience with intuitive visualization features, making it a valuable asset for researchers in the quantum computing domain.

18.
Front Artif Intell ; 7: 1436350, 2024.
Article in English | MEDLINE | ID: mdl-39268193

ABSTRACT

The developments in conversational AI raised urgent questions about the future direction of many aspects of society, including computing education. The first reactions to the fast-paced evolution of conversational agents were varied: Some announced "the end of programming," while others considered this "premature obituary of programming." Some adopted a defensive approach to detecting the use of conversational AI and avoiding an increase in plagiarism, while others questioned, "So what if ChatGPT wrote it?" Nevertheless, questions arise about whether computing education in its current form will still be relevant and fit for purpose in the era of conversational AI. Recognizing these diverse reactions to the advent of conversational AI, this paper aims to contribute to the ongoing discourse by exploring the current state through three perspectives in a dedicated literature review: adoption of conversational AI in (1) software engineering education specifically and (2) computing education in general, and (3) a comparison with software engineering practice. Our results show a gap between software engineering practice and higher education in the pace of adoption and the areas of use and generally identify preliminary research on student experience, teaching, and learning tools for software engineering.

19.
Cureus ; 16(8): e66779, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39268273

ABSTRACT

The integration of fog computing into healthcare promises significant advancements in real-time data analytics and patient care by decentralizing data processing closer to the source. This shift, however, introduces complex regulatory, privacy, and security challenges that are not adequately addressed by existing frameworks designed for centralized systems. The distributed nature of fog computing complicates the uniform application of security measures and compliance with diverse international regulations, raising concerns about data privacy, security vulnerabilities, and legal accountability. This review explores these challenges in depth, discussing the implications of fog computing's decentralized architecture for data privacy, the difficulties in achieving consistent security across dispersed nodes, and the complexities of ensuring compliance in multi-jurisdictional environments. It also examines specific regulatory frameworks, including the Health Insurance Portability and Accountability Act (HIPAA) in the United States, the General Data Protection Regulation (GDPR) in the European Union, and emerging laws in Asia and Brazil, highlighting the gaps and the need for regulatory evolution to better accommodate the nuances of fog computing. The review advocates for a proactive regulatory approach, emphasizing the development of specific guidelines, international collaboration, and public-private partnerships to enhance compliance and support innovation. By embedding privacy and security by design and leveraging advanced technologies, healthcare providers can navigate the regulatory landscape effectively, ensuring that fog computing realizes its full potential as a transformative healthcare technology without compromising patient trust or data integrity.
