ABSTRACT
T cell epitopes presented on the surface of mammalian cells are the product of a complex network of antigen processing and presentation steps. Within this network, C-terminal antigen processing constitutes one of the main bottlenecks for epitope generation, as it defines the C-terminal end of the final epitope and delimits the peptidome that will be presented downstream. Previously (Amengual-Rigo and Guallar, Sci Rep 111(11):1-8, 2021), we demonstrated that NetCleave stands out as one of the best algorithms for predicting C-terminal processing, which in turn can be crucial for designing peptide-based vaccination strategies. In this chapter, we provide a pipeline to exploit the full capabilities of NetCleave, an open-source and retrainable algorithm for predicting C-terminal antigen processing in the MHC-I and MHC-II pathways.
Subject(s)
Antigen Presentation, Epitopes, T-Lymphocyte, Animals, Algorithms, Mammals/metabolism

ABSTRACT
The high accuracy and dynamic performance of parallel robots (PRs) make them suitable for ensuring safe operation in human-robot interaction. However, these advantages come at the expense of a reduced workspace and the possible appearance of type II singularities. The latter cause a loss of control of the PR and require further analysis to preserve the stiffness of the PR even after a singular configuration is reached. All or a subset of the limbs can be responsible for a type II singularity, and the responsible limbs can be identified by the angle between two output twist screws (OTSs). However, this angle has not been applied in control because it requires an accurate measure of the pose of the PR. This paper proposes a new hybrid controller that releases a 4-DOF PR from a type II singularity based on a real-time vision system. The vision system data are used to automatically readapt the configuration of the PR by moving the limbs identified by the angle between two OTSs. The controller is intended for a knee rehabilitation PR, and the results show that the release is accomplished with smooth, controlled movements that do not compromise the patient's safety.
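The singularity test described above can be illustrated with a minimal sketch. The representation (plain 6-component twist vectors, Euclidean dot product) and the threshold `tol` are illustrative assumptions, not the paper's exact metric: the idea is simply that two OTSs becoming (anti)parallel signals a loss of independence and hence a nearby type II singularity.

```python
import math

# Hypothetical sketch: each output twist screw (OTS) is taken here as a
# 6-component vector (angular part, linear part). The Euclidean angle and
# the tolerance below are assumptions for illustration only.

def screw_angle(s1, s2):
    """Angle (radians) between two 6D twist screws via the dot product."""
    dot = sum(a * b for a, b in zip(s1, s2))
    n1 = math.sqrt(sum(a * a for a in s1))
    n2 = math.sqrt(sum(a * a for a in s2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def near_type_ii(s1, s2, tol=0.1):
    """Flag a near type II singularity when the screws are (anti)parallel."""
    ang = screw_angle(s1, s2)
    return min(ang, math.pi - ang) < tol
```

In the paper's controller the pose needed to evaluate these screws comes from the vision system; here the screws are simply given as inputs.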
Subject(s)
Robotics, Humans, Knee Joint, Vision, Ocular

ABSTRACT
This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) while achieving performance similar to that of traditional methods. The event is defined to reflect the need for global information: it fires when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-drive mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built from the NXT motor encoders together with one gyroscope, one compass, and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments.
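The event-triggered update rule can be sketched in one dimension. This is a minimal illustration of the scheduling idea, not the paper's filter: the noise values `q`, `r`, the variance limit `p_max`, and the scalar state are all assumptions. The IMU-driven prediction runs every step, while the costly global measurement is fused only when the error variance exceeds the predefined limit.

```python
# Minimal 1D sketch of event-based Kalman filtering (hypothetical values):
# dead reckoning with the IMU runs continuously; the global sensor
# (camera/GPS) is queried only when the estimation error variance
# exceeds a predefined limit -- the "event" described in the abstract.

def event_based_kf(imu_vel, global_pos, dt=0.1, q=0.05, r=0.5, p_max=0.3):
    x, p = 0.0, 1.0            # state (position) and its error variance
    updates = 0                # number of global-sensor corrections used
    trajectory = []
    for v, z in zip(imu_vel, global_pos):
        # Prediction with the IMU: uncertainty grows by the process noise q
        x += v * dt
        p += q
        # Event: fuse the global measurement only when p is too large
        if p > p_max:
            k = p / (p + r)    # Kalman gain
            x += k * (z - x)   # correction with the global measurement z
            p *= (1.0 - k)     # variance shrinks after the correction
            updates += 1
        trajectory.append(x)
    return trajectory, updates
```

Because corrections fire only on the event, the filter consumes the global sensor (and the bandwidth it needs) on a fraction of the steps while keeping the variance bounded by `p_max` plus one prediction step.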