Results 1 - 20 of 682
1.
Acta bioquím. clín. latinoam ; 58(1): 4-4, mar. 2024. graf
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1556653

Abstract

Abstract Syphilis is one of the sexually transmitted infections with the highest incidence in Argentina. For its diagnosis, the Argentine Ministry of Health endorses several algorithms, among them the traditional and the reverse algorithm. In the traditional algorithm, the VDRL is the screening test and positive results are confirmed with the Treponema pallidum particle agglutination assay (TPPA). The reverse algorithm with a rapid test, endorsed more recently, consists of performing a rapid treponemal test for screening, followed by VDRL on the samples that test positive. Both algorithms were compared to evaluate whether implementing the reverse algorithm with a rapid test at the laboratory of H.I.G.A. Dr. Oscar Alende would be feasible and convenient. The objective of this work was to determine the concordance between the traditional algorithm currently used at the institution (VDRL followed by TPPA) and the proposed new algorithm (Alere Determine Syphilis TP rapid treponemal test followed by VDRL-USR). For that purpose, a prospective study of the performance of qualitative methods was carried out. VDRL-USR, TPPA and the Alere Determine Syphilis TP rapid test were performed on samples from 580 patients, of whom 558 met the inclusion criteria. A total of 51 samples tested positive and 507 tested negative for syphilis under both algorithms, an overall concordance of 100%, which indicates that the traditional algorithm could be replaced by the reverse algorithm in those situations that require it in the studied population.
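
A minimal sketch of the concordance calculation described above, assuming paired per-sample results from the two algorithms coded as booleans; function and variable names are illustrative, not from the study:

```python
# Minimal sketch: overall percent agreement between two diagnostic
# algorithms, assuming paired per-sample results coded as booleans.
# Names are illustrative, not from the study's dataset.

def overall_agreement(traditional: list[bool], reverse: list[bool]) -> float:
    """Fraction of samples on which both algorithms give the same result."""
    assert len(traditional) == len(reverse)
    agree = sum(t == r for t, r in zip(traditional, reverse))
    return agree / len(traditional)

# Example mirroring the reported figures: 51 positive and 507 negative
# samples classified identically by both algorithms -> 100% agreement.
trad = [True] * 51 + [False] * 507
rev  = [True] * 51 + [False] * 507
print(f"Overall agreement: {overall_agreement(trad, rev):.1%}")  # 100.0%
```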


2.
Rev. argent. cir ; 116(1): 24-31, mar. 2024. graf
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1559262

Abstract

ABSTRACT Background: Chest wall perforator flaps are a good option for immediate breast reconstruction after breast-conserving surgery. Objective: The aim of this study was to describe the clinical results of an algorithm for selecting chest wall perforator flaps for breast reconstruction after breast-conserving surgery for breast cancer. Material and methods: We conducted a descriptive, retrospective study. Information was retrieved from the medical records of patients with breast cancer who underwent breast-conserving surgery and required reconstruction with chest wall perforator flaps between January 2020 and December 2022. The indications included volume deficit, contour defect and asymmetry. The vascular pedicle of the flap was evaluated by color Doppler ultrasound in all cases, which allowed us to follow an algorithm for selecting the best flap option. Results: Twenty flaps were performed in 19 patients. Mean age was 52 ± 11 years (range, 30-76). There were no intraoperative complications. One patient required reoperation due to a hematoma compressing the vascular pedicle, with partial flap loss, and another flap presented superficial epidermolysis. There were no cases of complete flap loss. All patients underwent postoperative radiation therapy without loss of volume or retraction. Mean follow-up was 15 months. At 6 months, patients rated the results as excellent, good and fair in 7, 11 and 2 cases, respectively. Conclusion: Selecting local perforator flaps to correct breast defects after breast-conserving surgery, using preoperative color Doppler ultrasound to identify the vascular pedicle and a specific algorithm, yielded satisfactory aesthetic results without the need for alloplastic material or subsequent revisions.

3.
Shanghai Journal of Preventive Medicine ; (12): 98-103, 2024.
Article in Chinese | WPRIM | ID: wpr-1012662

Abstract

Objective: To elucidate the principles and methods of the Bayesian probabilistic linkage model, and to demonstrate the effect of applying the model to link birth and death data. Methods: Through the Shanghai birth and death registration system, data were collected on 199 025 infants born in 2017 and 1 512 infants who died in 2017 and 2018. After cleaning, the data were divided into monthly blocks and fully linked. The Jaro-Winkler algorithm and Euclidean distance were employed to measure field similarity for matching. A Bayesian probabilistic linkage model was constructed, and the linking effect was evaluated using a confusion matrix. Results: Using the Bayesian probabilistic linkage model, the birth and death data of infants were effectively linked, revealing that 36.71% of infants who died in Shanghai were born outside the city, and that the probability of infant death was 2.6‰. The confusion matrix of the test set showed a recall of 0.86, a precision of 0.76, and an F-score of 0.81. Conclusion: The practical application of Bayesian probabilistic linkage demonstrates good model performance, enabling the establishment of birth-death cohorts that more accurately reflect the true level of infant mortality. Using this technique to integrate data from different departments can effectively improve research efficiency in the field of public health.
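
A small sketch of the evaluation step: precision, recall and F-score from a confusion matrix of candidate record pairs. The pair counts below are invented, chosen only so the metrics land on the reported 0.76/0.86/0.81; field similarity in the real pipeline would come from Jaro-Winkler scores (e.g. via the jellyfish package, an assumed dependency):

```python
# Sketch: precision/recall/F-score from a confusion matrix of record
# pairs. Counts are invented so the metrics land on the reported values
# (recall 0.86, precision 0.76, F-score 0.81). Field similarity would
# come from Jaro-Winkler, e.g. jellyfish.jaro_winkler_similarity(a, b).

def prf(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_score = 2 * precision * recall / (precision + recall)
    return precision, recall, f_score

p, r, f = prf(tp=86, fp=27, fn=14)  # 86 true links, 27 false, 14 missed
print(f"precision={p:.2f} recall={r:.2f} F-score={f:.2f}")
```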

4.
China Pharmacy ; (12): 327-332, 2024.
Article in Chinese | WPRIM | ID: wpr-1006618

Abstract

OBJECTIVE To optimize the ethanol extraction process of Yihuang powder. METHODS An orthogonal reflux-extraction experiment was designed with ethanol volume fraction, liquid-to-solid ratio, and extraction time as the investigated factors. The evaluation indexes were the contents of hesperidin, nobiletin, tangeretin, gallic acid, chebulagic acid, chebulinic acid, liquiritin, glycyrrhizin and eugenol, and the paste-forming rate; the analytic hierarchy process (AHP) was used to calculate a comprehensive score. The optimal parameters were determined by verifying the processes predicted by the orthogonal experiment and by a genetic algorithm-optimized back-propagation neural network (GA-BP neural network). RESULTS The optimal parameters from the orthogonal experiment were an ethanol volume fraction of 60%, a liquid-to-solid ratio of 14∶1 (mL/g), an extraction time of 90 min, and two extractions; the verified comprehensive score was 79.19. The optimal parameters from the GA-BP neural network were an ethanol volume fraction of 65%, a liquid-to-solid ratio of 14∶1 (mL/g), an extraction time of 60 min, and two extractions; the verified comprehensive score was 85.30, higher than the result of the orthogonal experiment. CONCLUSIONS Orthogonal experiment combined with a GA-BP neural network is superior to the traditional orthogonal-experiment optimization method alone. The optimized ethanol extraction process of Yihuang powder is stable and reliable.
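
A hedged sketch of the AHP comprehensive-score step, assuming min-max normalisation of the indicator columns and principal-eigenvector weights; the pairwise comparison matrix and run data are toy values, not the study's:

```python
# Hedged sketch of an AHP-weighted comprehensive score. Each extraction
# run is scored on min-max normalised indicators (component contents,
# paste-forming rate). Weights and data below are illustrative only.
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Principal-eigenvector weights from an AHP pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def comprehensive_scores(indicators: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Min-max normalise each indicator column, then weight and sum (0-100)."""
    mins, maxs = indicators.min(axis=0), indicators.max(axis=0)
    norm = (indicators - mins) / (maxs - mins)
    return 100 * norm @ weights

# 3 runs x 3 indicators (toy numbers), with a toy importance matrix
pairwise = np.array([[1, 2, 2], [1/2, 1, 1], [1/2, 1, 1]])
runs = np.array([[1.2, 0.8, 15.0], [1.5, 0.7, 18.0], [1.1, 0.9, 16.0]])
print(comprehensive_scores(runs, ahp_weights(pairwise)).round(1))
```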

5.
Acta bioquím. clín. latinoam ; 57(4): 4-4, dic. 2023. graf
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1556642

Abstract

Abstract The objective of this work was to compare the performance of treponemal and non-treponemal screening in two close periods of time in blood donors, and to analyse the association of the signal strength (signal-to-cutoff ratio, S/CO) of the chemiluminescent immunoassay (CIA) with the reactivity of the enzyme immunoassay (EIA) and the rapid plasma reagin (RPR) test. Donors were screened with treponemal tests. The distribution of the S/CO values obtained by CIA was analysed as a function of EIA and RPR results, and its association was evaluated between two groups of laboratory results, Group 1: ELISA+/RPR+ donors and Group 2: ELISA+/RPR- donors. A total of 76,794 voluntary donations were processed. Comparing the medians between groups, the CIA S/CO ratio was significantly higher for Group 1 donors (19.5 vs. 8.10; p<0.001). In conclusion, the signal strength of the qualitative CIA test appears to be associated with the reactivity of the RPR and may be related to the course of the infection.
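
A brief sketch of the group comparison on simulated data. Since medians of S/CO ratios are compared, a Mann-Whitney U test (scipy) is a natural nonparametric choice; whether the study used this exact test is not stated in the abstract, and the distributions below are invented stand-ins for the donor groups:

```python
# Sketch: comparing S/CO ratios of ELISA+/RPR+ vs. ELISA+/RPR- donors
# with a Mann-Whitney U test (suits medians such as the reported
# 19.5 vs. 8.10). Data are simulated, not the study's.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
group1 = rng.lognormal(mean=np.log(19.5), sigma=0.5, size=120)  # ELISA+/RPR+
group2 = rng.lognormal(mean=np.log(8.1), sigma=0.5, size=80)    # ELISA+/RPR-

stat, p = mannwhitneyu(group1, group2, alternative="two-sided")
print(f"median1={np.median(group1):.1f} median2={np.median(group2):.1f} p={p:.2e}")
```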


6.
Estud. pesqui. psicol. (Impr.) ; 23(4): 1486-1505, dez. 2023.
Article in Portuguese | LILACS, INDEXPSI | ID: biblio-1538191

Abstract

The digital algorithm has allowed informational conglomerates to manage web users' data. In a discreet and personalized way, this new form of governmentality collects, organizes, exchanges and returns data to the individual in the form of more information. More and more, it comes up against the singular dimension, touching the field of jouissance via the proliferation of objects a, which, in the Lacanian theory of discourses, assume the double function of loss and of an incessant attempt to supplement jouissance. With the increase in information, the object reaches its social apex and the digital reaches a discursive level. Inserting itself into the same niche as knowledge [savoir], digital information takes advantage of the subjective division, leaving little space for the subject to deal with the entropy of his jouissance via desire. If negentropy is the attribute of savoir that limits the dispersion of jouissance, this process is accelerated in information processed and returned algorithmically, acting directly on the economy of affects. To the detriment of the subject, what remains is an experience of jouissance that is increasingly direct and raw, less mediated by savoir and by the Other.


Subjects
Psychoanalytic Theory, Information Dissemination, Pleasure, Internet Access
7.
Rev. sanid. mil ; 77(3): e04, jul.-sep. 2023. tab, graf
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1536754

Abstract

Abstract Introduction: Stevens-Johnson syndrome (SJS) is a potentially fatal dermatosis characterized by extensive epidermal and mucosal necrosis accompanied by systemic compromise. Together with toxic epidermal necrolysis (TEN), it is considered a type IV hypersensitivity reaction, related to certain drugs in 60% of cases; it is a rare diagnosis but carries a high mortality of up to 40%. Case report: A 34-year-old male developed generalized erythema immediately after administration of trimethoprim/sulfamethoxazole. A complete blood count showed leukocytosis and neutrophilia, with elevated ESR, elevated CRP and elevated IgE. After clinical questioning, the ALDEN algorithm was applied, yielding a positive score of 10 points associated with the aforementioned drug. Treatment was started with methylprednisolone, diphenhydramine, intravenous human immunoglobulin and a cutaneous therapeutic plan, resulting in clinical improvement and avoiding complications and sequelae until the day of discharge. In conclusion, multidisciplinary management is required to address the patient's clinical manifestations, helping him to a quick and effective recovery.

8.
Rev. medica electron ; 45(4)ago. 2023.
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1515362

Abstract

Introduction: In recent years there have been profound changes in the management of polytrauma patients. At the same time, new concepts have been developed regarding possible complications, treatment schemes and prognostic scales, as well as the identification of elements related to their evolution, in order to determine early the life-threatening injuries that require immediate surgical control or interventional radiology. Multislice computed tomography and other imaging studies provide images of body structures by planes and yield very detailed and useful diagnostic information; however, there is no consensus on when to indicate one or the other in trauma. Objective: To develop an algorithm for the efficient indication of imaging studies in the polytrauma patient. Materials and methods: A technological development study was carried out. The study universe comprised 43 patients meeting polytrauma criteria who needed imaging studies, admitted to the Hospital Universitario Clínico Quirúrgico Comandante Faustino Pérez Hernández between March 2020 and March 2021. Results: An algorithm was developed to standardize the indication of imaging studies in trauma. This work considers the safety and protection of the patient and the preservation of the service life of the computed tomography equipment. Conclusions: The algorithm facilitates decision-making regarding the use of imaging resources in the care of polytrauma patients.

9.
Odovtos (En línea) ; 25(2)ago. 2023.
Article in English | LILACS-Express | LILACS | ID: biblio-1448745

Abstract

Three-dimensional cone-beam computed tomography (CBCT) plays an important role in the detection of vertical root fractures (VRFs). The effect of artifact generation by high-density objects such as dental implants on image quality is well documented. This study aimed to assess the effect of tooth-implant distance and of applying a metal artifact reduction (MAR) algorithm on the detection of VRFs on CBCT scans. The study was conducted on 20 endodontically treated single-rooted teeth. VRFs were induced in 10 teeth, while the other 10 remained intact. The implant was inserted in the right second premolar socket, and two teeth were randomly placed in the right canine and right first premolar sockets; CBCT scans were acquired with and without the MAR algorithm. SPSS 21 was used to analyze the results (alpha=0.05). Sensitivity, specificity, accuracy and positive predictive value were all higher without the MAR software at both the close distance (roots in the first premolar socket) and the far distance (roots in the canine socket) from the implant. The highest diagnostic accuracy for both radiologists was in the far-distance group without MAR, and the lowest was at the close distance to the implant. Applying the MAR algorithm had no positive effect on the detection of VRFs on CBCT scans at either distance.


10.
Arch. cardiol. Méx ; 93(2): 164-171, Apr.-Jun. 2023. tab, graf
Article in English | LILACS-Express | LILACS | ID: biblio-1447247

Abstract

Abstract Background: In 1996, Iturralde et al. published an algorithm based on QRS polarity to determine the location of accessory pathways (APs); the algorithm was developed before invasive electrophysiology was widely practiced. Purpose: To validate the QRS-Polarity algorithm in a modern cohort of subjects submitted to radiofrequency catheter ablation (RFCA), determining its global accuracy and its accuracy for parahisian APs. Methods: We conducted a retrospective analysis of patients with Wolff-Parkinson-White (WPW) syndrome who underwent an electrophysiological study (EPS) and RFCA. We employed the QRS-Polarity algorithm to predict the anatomical location of the AP and compared this prediction with the actual location determined in the EPS. Cohen's kappa coefficient (k) and the Pearson correlation coefficient were used to determine accuracy. Results: A total of 364 patients were included (mean age 30 years, 57% male). The global k score was 0.78 and the Pearson coefficient was 0.90. Accuracy was also evaluated for each zone; the best correlation was for left lateral APs (k of 0.97). There were 26 patients with a parahisian AP, who showed great variability in ECG features. Employing the QRS-Polarity algorithm, 34.6% of these patients had a correct anatomical location, 42.3% an adjacent location and only 23% an incorrect location. Conclusion: The QRS-Polarity algorithm has good global accuracy; its precision is high, especially for left lateral APs. The algorithm is also useful for parahisian APs.
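
A minimal sketch of the agreement statistic, assuming per-patient predicted and observed AP zones as categorical labels (the zone names are illustrative); scikit-learn's cohen_kappa_score implements Cohen's kappa:

```python
# Sketch: Cohen's kappa between the AP location predicted from QRS
# polarity and the location found at the electrophysiological study.
# Zone labels below are illustrative placeholders.
from sklearn.metrics import cohen_kappa_score

predicted = ["left_lateral", "parahisian", "posteroseptal", "left_lateral"]
observed  = ["left_lateral", "right_anterior", "posteroseptal", "left_lateral"]

print(f"kappa = {cohen_kappa_score(predicted, observed):.2f}")
```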


12.
Colomb. med ; 54(1)mar. 2023.
Article in English | LILACS-Express | LILACS | ID: biblio-1534279

Abstract

Background: Pathology reports are stored as unstructured, ungrammatical, fragmented and abbreviated free text, with linguistic variability among pathologists. For this reason, extracting tumor information requires significant human effort. Recording data in an efficient, high-quality format is essential for implementing and maintaining a hospital-based cancer registry. Objective: This study aimed to describe the implementation of a natural language processing algorithm for oncology pathology reports. Methods: An algorithm was developed to process oncology pathology reports in Spanish and extract 20 medical descriptors. The approach is based on the successive matching of regular expressions. Results: Validation was performed on 140 pathology reports. Topography was identified both manually and by the algorithm in all reports; morphology was identified manually in 138 reports and by the algorithm in 137. The average fuzzy-matching score was 68.3 for topography and 89.5 for morphology. Conclusions: A preliminary validation of the algorithm against human extraction was performed on a small set of reports, with satisfactory results. This shows that a regular-expression approach can accurately and precisely extract multiple specimen attributes from free-text Spanish pathology reports. Additionally, we developed a website to facilitate collaborative validation at a larger scale, which may be helpful for future research on the subject.
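
A hedged sketch of the regular-expression approach on an invented Spanish report fragment; the patterns, field names and reference terms are illustrative, and the fuzzy score uses Python's standard-library difflib rather than whatever matcher the authors used:

```python
# Sketch: successive regular expressions pull descriptors out of
# free-text Spanish pathology reports; difflib gives a fuzzy score
# against a reference term. Patterns and report text are invented.
import re
from difflib import SequenceMatcher

TOPOGRAPHY_RE = re.compile(r"(?:localizaci[oó]n|sitio)\s*:\s*([^\n.;]+)", re.I)
MORPHOLOGY_RE = re.compile(r"(?:diagn[oó]stico|morfolog[ií]a)\s*:\s*([^\n.;]+)", re.I)

def extract(report: str) -> dict[str, str | None]:
    topo = TOPOGRAPHY_RE.search(report)
    morf = MORPHOLOGY_RE.search(report)
    return {"topography": topo and topo.group(1).strip(),
            "morphology": morf and morf.group(1).strip()}

def fuzzy(a: str, b: str) -> float:
    """Similarity on a 0-100 scale, like common fuzzy-matching scores."""
    return 100 * SequenceMatcher(None, a.lower(), b.lower()).ratio()

report = "Sitio: colon ascendente. Diagnóstico: adenocarcinoma tubular"
fields = extract(report)
print(fields, round(fuzzy(fields["morphology"], "adenocarcinoma"), 1))
```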


13.
Rev. cuba. pediatr ; 95, 2023. ilus, tab
Article in Spanish | LILACS, CUMED | ID: biblio-1515282

Abstract

Introduction: Inflammation of the pleura triggered by bacteria and mediated by cytokines increases vascular permeability and produces vasodilation, generating an imbalance between the production of pleural fluid and its reabsorption by efficient physiological mechanisms. This condition leads to the development of parapneumonic pleural effusion. Objective: To present the importance of the pathophysiological and diagnostic correlation with the fundamental pillars of therapeutic action in parapneumonic pleural effusion. Methods: Review in PubMed and Google Scholar of articles published up to April 2021 addressing parapneumonic pleural effusion, its pathophysiology, diagnostic elements (both clinical findings and results of pleural fluid studies), imaging tests, and therapeutic strategies. Analysis and synthesis of information: The progression of a lung infection and the invasion of germs into the pleural space favor the activation of mechanisms that lead to fluid accumulation, fibrin deposition and septum formation. This pathological process results in clinical manifestations, changes in cytochemical values and microbiological findings in the pleural fluid which, together with radiological and ultrasound signs in the chest, guide the timely application of the pillars of treatment of parapneumonic pleural effusion. Conclusions: In a parapneumonic pleural effusion with septa or particles in suspension on chest ultrasound, or when fibrin, turbid fluid or pus is found during chest drain placement, intrapleural fibrinolysis should be initiated. When treatment with intrapleural fibrinolytics fails, video-assisted thoracoscopic surgery is the surgical procedure of choice.


Subjects
Humans, Pleural Effusion/classification, Pleural Effusion/physiopathology, Pleural Effusion/drug therapy, Pleural Effusion/diagnostic imaging, Drainage/instrumentation, Anti-Bacterial Agents
14.
Afr. j. lab. med. (Online) ; 12(1): 1-4, 2023. figures
Article in English | AIM | ID: biblio-1413499

Abstract

Introduction: Determining the HIV status of some individuals remains challenging due to multidimensional factors such as flaws in diagnostic systems, technological challenges, and viral diversity. This report pinpoints challenges faced by the HIV testing system in Cameroon. Case presentation: A 53-year-old male received a positive HIV result by a rapid testing algorithm in July 2016. Not convinced of his HIV status, he requested additional tests. In February 2017, he received a positive result using the ImmunoComb® II HIV 1 & 2 BiSpot and Roche cobas electrochemiluminescence assays. A sample sent to France in April 2017 was positive on the Bio-Rad GenScreen™ HIV 1/2, but serotyping was indeterminate and the viral load was < 20 copies/mL. The Roche electrochemiluminescence immunoassay and the INNO-LIA HIV I/II Score were negative for samples collected in 2018. A sample collected in July 2019 and tested with the VIDAS® HIV Duo Ultra enzyme-linked fluorescent assay and the Geenius™ HIV 1/2 Confirmatory Assay was positive, but negative by Western blot; the CD4 count was 1380 cells/mm3, and HIV proviral DNA tested in France was 'target not detected'. Some rapid tests were still positive in 2020 and 2021. Serotyping remained indeterminate, and the viral load remained 'target not detected'. There was no self-reported exposure to HIV risk factors, and his wife was HIV-seronegative. Management and outcome: Given that the patient remained asymptomatic with no evidence of viral replication, no antiretroviral therapy was initiated. Conclusion: This case highlights the struggles some individuals face in confirming their HIV status, and the need to update existing technologies and develop an algorithm for managing exceptional cases.

15.
Journal of Medical Biomechanics ; (6): E346-E352, 2023.
Article in Chinese | WPRIM | ID: wpr-987957

Abstract

Objective: To investigate the effect of different optimization algorithms on the accurate reconstruction of traffic accidents. Methods: Non-dominated sorting genetic algorithm II (NSGA-II), neighborhood cultivation genetic algorithm (NCGA) and multi-objective particle swarm optimization (MOPSO) were used to optimize the multi-rigid-body dynamic reconstruction of a real case. The effects of the different optimization algorithms on convergence speed and on the optimal approximate solution were studied. The optimal initial impact parameters were then used as boundary conditions for finite element simulation, and the simulated results were compared with the actual injuries. Results: NCGA converged faster and produced a better result in the optimization process. The kinematic response of the pedestrian-vehicle collision reconstructed from the optimal approximate solution was consistent with the surveillance video, and the predicted craniocerebral injury was basically consistent with the cadaver examination. Conclusions: Combining optimization algorithms, rigid multibody dynamics and the finite element method enables accurate reconstruction of traffic accidents and reduces the influence of human factors.
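
An illustrative sketch of one of the named optimizers, NSGA-II, applied to a toy two-parameter reconstruction problem. It assumes the pymoo package, and the objective functions are invented stand-ins for the multibody simulation's deviation from the scene evidence:

```python
# Sketch: two-objective NSGA-II search over impact parameters (e.g.
# impact speed and pedestrian orientation), minimising the deviation of
# simulated throw distance and head impact position from measured
# values. Objectives below are toy formulas, not a real simulation.
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class ReconProblem(ElementwiseProblem):
    def __init__(self):
        # x = [impact speed km/h, pedestrian orientation deg]
        super().__init__(n_var=2, n_obj=2, xl=[20.0, 0.0], xu=[80.0, 180.0])

    def _evaluate(self, x, out, **kwargs):
        speed, angle = x
        throw_err = abs(0.0052 * speed**2 - 12.3)                  # vs measured throw (m)
        head_err = abs(speed * np.cos(np.radians(angle)) - 25.0)   # vs head contact point
        out["F"] = [throw_err, head_err]

res = minimize(ReconProblem(), NSGA2(pop_size=40), ("n_gen", 60), seed=1, verbose=False)
print(res.X[:3], res.F[:3])  # a few Pareto-optimal parameter sets
```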

16.
China Journal of Chinese Materia Medica ; (24): 1132-1136, 2023.
Article in Chinese | WPRIM | ID: wpr-970585

Abstract

In observational studies, herbal prescriptions are usually studied in the form of "similar prescriptions". At present, the classification of prescriptions is based mainly on clinical experience, but manual judgment suffers from a lack of unified criteria, is labor-intensive, and is difficult to verify. In constructing a database of integrated traditional Chinese and western medicine for the treatment of coronavirus disease 2019 (COVID-19), our research group attempted to classify real-world herbal prescriptions using a similarity matching algorithm. The main steps are: 78 target prescriptions are determined in advance; the drugs of each target prescription are labeled with four levels of importance; the prescriptions to be identified in the herbal medicine database undergo drug-name merging, format conversion and standardization; the similarity between each prescription to be identified and each target prescription is calculated one by one; prescriptions are discriminated based on preset criteria; and overlapping prescription names are resolved under the rule that "large prescriptions cover the small". Through this similarity matching algorithm, 87.49% of the real prescriptions in the herbal medicine database of this study could be identified, which preliminarily proves that the method can classify herbal prescriptions. However, the method does not consider the influence of herbal dosage on the results, and there is no recognized standard for the importance weights of drugs or for the discrimination criteria, so it has limitations that need to be explored and improved in future research.
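
A minimal sketch of the similarity-matching idea, assuming importance-weighted herb lists per target prescription and a coverage-style score; the target prescriptions, weights and 0.75 threshold are illustrative, not the study's calibrated values, and the "large prescriptions cover the small" post-step is omitted:

```python
# Sketch: each target prescription lists herbs with importance weights;
# a candidate's similarity to a target is the weight-covered fraction
# of the target. Targets, weights and threshold are illustrative.

TARGETS = {
    "Yinqiao San": {"jinyinhua": 4, "lianqiao": 4, "bohe": 2, "jiegeng": 1},
    "Maxing Shigan Tang": {"mahuang": 4, "xingren": 3, "shigao": 4, "gancao": 2},
}

def similarity(candidate: set[str], target: dict[str, int]) -> float:
    covered = sum(w for herb, w in target.items() if herb in candidate)
    return covered / sum(target.values())

def classify(candidate: set[str], threshold: float = 0.75) -> str | None:
    scores = {name: similarity(candidate, t) for name, t in TARGETS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

print(classify({"jinyinhua", "lianqiao", "bohe", "niubangzi"}))  # Yinqiao San
```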


Subjects
Humans, COVID-19, Algorithms, Factual Databases, Prescriptions, Plant Extracts
17.
Journal of Environmental and Occupational Medicine ; (12): 1115-1120, 2023.
Article in Chinese | WPRIM | ID: wpr-998764

Abstract

Background: Identification and analysis of the influencing factors of occupational injury is an important subject of feature selection research. In recent years, with the rise of machine learning algorithms, feature selection combined with Boosting algorithms has provided a new way to construct occupational injury prediction models. Objective: To evaluate the applicability of Boosting-based models in predicting the severity of miners' non-fatal occupational injuries, and to provide a basis for rationally predicting that severity. Methods: The publicly available data of the US Mine Safety and Health Administration (MSHA) from 2001 to 2021 on metal miners' non-fatal occupational injuries were used; the outcome variable was lost working days < 105 d (minor injury) or ≥ 105 d (serious injury). Four feature sets were screened out by four feature selection methods: least absolute shrinkage and selection operator (Lasso) regression, stepwise regression, single factor + Lasso regression, and single factor + stepwise regression. Logistic regression, gradient boosting decision tree (GBDT) and extreme gradient boosting (XGBoost) models were trained on the four feature sets, yielding 12 prediction models of injury severity, which were evaluated by area under the curve (AUC), sensitivity, specificity and Youden index. Results: According to the four feature selection methods, age, time of accident occurrence, total length of service, cause of injury, activity that triggered the injury, body part injured, nature of injury and outcome of injury were identified as influencing factors of injury severity. Feature set 4, screened out by single factor + stepwise regression, was optimal, and the GBDT model trained on it showed the best predictive performance, with specificity, sensitivity and Youden index of 0.7530, 0.9490 and 0.7020, respectively. The AUC values of the logistic regression, GBDT and XGBoost models trained on feature set 4 were 0.8526 (95%CI: 0.8387, 0.8750), 0.8640 (95%CI: 0.8474, 0.8806) and 0.8603 (95%CI: 0.8439, 0.8773), respectively, higher than those trained on feature set 2 [0.8487 (95%CI: 0.8203, 0.8669), 0.8110 (95%CI: 0.8012, 0.8344) and 0.8439 (95%CI: 0.8245, 0.8561), respectively]. The AUC values of the GBDT and XGBoost models trained on feature set 4 were higher than that of the logistic regression model. Conclusion: Prediction models constructed with predictors screened out by combined feature selection methods outperform those using single feature selection methods. At the same time, under the optimal feature set, Boosting-based models outperform the traditional logistic regression model.
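
A compact sketch of the modelling and evaluation step on synthetic data: a GBDT classifier, AUC, and the Youden index read off the ROC curve (scikit-learn); the real predictors would be the screened features listed above, not these generated ones:

```python
# Sketch: GBDT classifier with AUC and Youden index
# (sensitivity + specificity - 1) from the ROC curve. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=8, weights=[0.8], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]

auc = roc_auc_score(y_te, prob)
fpr, tpr, thresholds = roc_curve(y_te, prob)
youden = tpr - fpr                      # sensitivity + specificity - 1
best = np.argmax(youden)
print(f"AUC={auc:.3f} best Youden={youden[best]:.3f} at threshold={thresholds[best]:.2f}")
```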

18.
International Eye Science ; (12): 2081-2086, 2023.
Article in Chinese | WPRIM | ID: wpr-998494

Abstract

AIM: To observe the changes in the Chang-Waring chord (CW chord) before and after cataract surgery using the IOL Master 700, and to predict the postoperative CW chord using an artificial intelligence model and preoperative measurement data. METHODS: Preoperative and postoperative IOL Master 700 measurements of 304 cataract patients were analyzed, including astigmatism vector values, average keratometry, axial length, anterior chamber depth, lens thickness, central corneal thickness, white-to-white distance, the position of the Purkinje reflex I image relative to the corneal center and pupil center, and the CW chord. Prediction models based on the support vector regression (SVR) algorithm and the BP neural network algorithm were established to predict the postoperative CW chord from the preoperative CW chord and ocular biometric parameters. RESULTS: The X component of the CW chord showed a slight temporal shift in both left and right eyes after cataract surgery, while the Y component changed little. The SVR model, using the preoperative CW chord and other preoperative biometric parameters as input, predicted the X and Y components of the postoperative CW chord more accurately than the BP neural network. CONCLUSION: The CW chord can be measured directly with a coaxial fixation light using various biometers, corneal topographers or tomographers. The SVR algorithm can accurately predict the postoperative CW chord before cataract surgery.
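
A hedged sketch of the SVR prediction step on synthetic data, wrapping one SVR per chord component since SVR is single-output; the feature layout and scales are invented, not the IOL Master 700 export format:

```python
# Sketch: support vector regression mapping preoperative biometry
# (axial length, ACD, keratometry, preoperative CW chord, ...) to the
# postoperative CW chord (x, y). One wrapped SVR per output component.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.multioutput import MultiOutputRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))                               # preoperative parameters
Y = X[:, :2] * 0.1 + rng.normal(0.0, 0.02, size=(300, 2))   # CW chord (x, y) in mm

model = MultiOutputRegressor(make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0)))
model.fit(X[:250], Y[:250])
pred = model.predict(X[250:])
print("mean abs error (mm):", np.abs(pred - Y[250:]).mean(axis=0).round(3))
```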

19.
China Pharmacy ; (12): 2333-2338, 2023.
Article in Chinese | WPRIM | ID: wpr-996388

Abstract

OBJECTIVE To optimize the pressurized processing technology of Strychnos nux-vomica boiled with mung beans. METHODS The least squares method was used to establish one-factor models for the effects of four factors (processing time, processing pressure, mung bean dosage and amount of water added) on the contents of strychnine and toxiferine, and the form of the multivariate model was hypothesized from these one-factor functions. Based on an orthogonal experiment, a genetic algorithm was used to solve for the undetermined coefficients in the model. A bi-objective optimization model based on strychnine and toxiferine contents was constructed according to the actual conditions, and the optimal process was obtained by solving the model function and was then validated. RESULTS The optimal processing technology was: boil S. nux-vomica with mung beans at 2.393 MPa saturated steam pressure for 5.5 h, then drain; rinse to remove the mung beans, scrape off the bark of S. nux-vomica and cut into 0.6 mm slices; use 180 g of mung beans and 15 L of water per 500 g of S. nux-vomica. CONCLUSIONS The optimized pressurized processing technology is stable and feasible, and can provide a reference for optimizing the processing technology of S. nux-vomica boiled with mung beans.
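
A small sketch of the first modelling step, a one-factor least-squares fit (numpy) of alkaloid content against processing time; the data points are invented for illustration, and the genetic-algorithm search for the multivariate model's coefficients is not shown:

```python
# Sketch: one-factor least-squares fit of alkaloid content vs. one
# processing factor (here a quadratic in processing time), a building
# block for the multivariate model. Data points are invented.
import numpy as np

time_h = np.array([3.0, 4.0, 5.0, 6.0, 7.0])           # processing time (h)
strychnine = np.array([1.90, 1.62, 1.41, 1.30, 1.28])  # content (mg/g)

coeffs = np.polyfit(time_h, strychnine, deg=2)  # least-squares quadratic fit
model = np.poly1d(coeffs)
print(model)                                    # fitted c2*t^2 + c1*t + c0
print("predicted content at 5.5 h:", round(model(5.5), 3))
```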

20.
Chinese Journal of Hospital Administration ; (12): 383-386, 2023.
Article in Chinese | WPRIM | ID: wpr-996094

Abstract

Artificial intelligence algorithms play an important role in the medical field thanks to their unique technological advantages and operational logic. However, due to the "black box" nature and security flaws of artificial intelligence algorithms, their application in intelligent medicine gives rise to problems such as the alienation of medical personnel's professional status, of the right to informed consent in diagnosis and treatment, of health equity, and of health information security. At the same time, regulating these risks faces practical difficulties, such as the difficulty of determining legal liability, incomplete causal chains, the lag of laws and regulations behind new application scenarios, and the growing difficulty of protecting health information. To prevent the alienation of power in medical intelligent algorithms and promote the healthy development of intelligent medicine, the author suggests keeping the tool attribute embedded in the underlying logic of medical algorithms, enhancing the interpretability of medical algorithms, improving relevant laws and regulations, and strengthening the risk supervision mechanism for intelligent medical algorithms.
