ABSTRACT
Objective: To achieve high-dimensional prediction of class-imbalanced adverse drug reaction (ADR) data for traditional Chinese medicine (TCM), and to classify and identify the risk factors affecting ADR occurrence, based on post-marketing safety data of TCM monitored centrally in real-world hospitals. Method: Ensemble clustering resampling combined with regularized Group Lasso regression was used to balance the high-dimensional, class-imbalanced ADR data; the balanced datasets were then integrated to predict ADR and to identify risk factors by category. Result: In a practical study of the proposed method on monitoring data for a TCM injection, the prediction accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve (AUC) were all above 0.8 on the test set. Meanwhile, 40 risk factors affecting ADR occurrence were screened out of 600 high-dimensional variables, and their effect on ADR occurrence was identified by classification weighting. The important risk factors fell into the following categories: past history, medication information, names of combined drugs, disease status, number of combined drugs, and personal data. Conclusion: For real-world data on rare ADR with a large number of clinical variables, this paper achieves accurate ADR prediction under high-dimensional, class-imbalanced conditions and classifies and identifies the key risk factors and the clinical significance of each category, providing risk early warning for rational clinical drug use and combined drug use, as well as a scientific basis for post-marketing safety re-evaluation of TCM.
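A minimal sketch of the general idea, not the authors' implementation: the ensemble clustering resampling step is approximated here by repeated undersampling of the majority class, and the regularized Group Lasso is replaced by a plain L1-penalized logistic model for illustration; the data, parameters, and selection-frequency rule are all invented for the example.

```python
# Balance a rare-ADR dataset by repeated undersampling of the majority class, fit an
# L1-penalized logistic model on each balanced subset (a stand-in for the regularized
# Group Lasso), then average predictions and count how often each variable is selected.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for high-dimensional, class-imbalanced monitoring data.
X, y = make_classification(n_samples=5000, n_features=600, n_informative=40,
                           weights=[0.98, 0.02], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rng = np.random.default_rng(0)
pos = np.flatnonzero(y_tr == 1)
neg = np.flatnonzero(y_tr == 0)

probas, selections = [], []
for _ in range(10):                                   # ensemble of balanced resamples
    idx = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X_tr[idx], y_tr[idx])
    probas.append(clf.predict_proba(X_te)[:, 1])
    selections.append(clf.coef_.ravel() != 0)

auc = roc_auc_score(y_te, np.mean(probas, axis=0))
selection_freq = np.mean(selections, axis=0)          # crude risk-factor ranking
print(f"test AUC: {auc:.3f}, variables selected in >=8/10 models:",
      int(np.sum(selection_freq >= 0.8)))
```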
ABSTRACT
The strategies for success in oral rehabilitation require the interrelation of several disciplines that together achieve predictable and lasting results. The individualized view of each specialty area may lead to not offering the best treatment alternative; for this reason, the assessment, diagnosis, and planning of the clinical case must be carried out by an interdisciplinary team, avoiding this situation and creating a synergy in which the «whole is greater than the sum of its parts». The objective of this work is to present a clinical case in which several specialty areas intervened: periodontics, prosthodontics, oral surgery, and oral pathology, restoring function and aesthetics through interdisciplinary management. (AU)
Subject(s)
Humans , Female , Middle Aged , Patient Care Team , Preprosthetic Oral Surgical Procedures/methods , Mouth Rehabilitation , Periodontitis/therapy , Schools, Dental , Patient Satisfaction , Photography, Dental , Advance Care Planning , Denture, Complete, Immediate , Esthetics, Dental , Alveolar Ridge Augmentation/methods , Labial Frenum/surgery , Mexico
ABSTRACT
Speech feature learning is the core of speech recognition methods for mental illness. Deep feature learning can automatically extract speech features, but it is limited by the problem of small samples. Traditional feature extraction (original features) can avoid the impact of small samples, but it relies heavily on experience and adapts poorly. To solve this problem, this paper proposes a deep-embedded hybrid-feature sparse stacked autoencoder manifold ensemble algorithm. First, based on prior knowledge, psychotic speech features are extracted and the original features are constructed. Second, the original features are embedded in a sparse stacked autoencoder (deep network), and the output of the hidden layer is filtered to enhance the complementarity between the deep features and the original features. Third, an L1-regularization feature selection mechanism is designed to compress the dimensions of the mixed feature set composed of deep features and original features. Finally, a weighted locality-preserving projection algorithm and an ensemble learning mechanism are designed, and a manifold projection classifier ensemble model is constructed, which further improves the classification stability of feature fusion under small samples. In addition, this paper designs a medium-to-large-scale psychotic speech collection program for the first time, and collects and constructs a large-scale Chinese psychotic speech database for the verification of psychotic speech recognition algorithms. The experimental results show that the main innovations of the algorithm are effective and that its classification accuracy is better than that of other representative algorithms, with a maximum improvement of 3.3%. In conclusion, this paper proposes a new method of psychotic speech recognition based on an embedded hybrid sparse stacked autoencoder and manifold ensemble, which effectively improves the recognition rate of psychotic speech.
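A minimal sketch under assumptions (not the paper's code): a small sparse autoencoder supplies "deep" features, which are concatenated with hand-crafted "original" features, and an L1-penalized logistic model compresses the mixed set; the weighted locality-preserving projection and the classifier ensemble are omitted, and all data and hyperparameters are invented.

```python
# Learn deep features with a sparse autoencoder, concatenate with original features,
# and select from the mixed set with L1-regularized logistic regression.
import numpy as np
import torch
import torch.nn as nn
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_orig = rng.normal(size=(200, 40)).astype("float32")   # hand-crafted speech features
y = rng.integers(0, 2, size=200)

class SparseAE(nn.Module):
    def __init__(self, d_in, d_hidden=16):
        super().__init__()
        self.enc = nn.Linear(d_in, d_hidden)
        self.dec = nn.Linear(d_hidden, d_in)
    def forward(self, x):
        h = torch.relu(self.enc(x))
        return h, self.dec(h)

ae = SparseAE(X_orig.shape[1])
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
xb = torch.from_numpy(X_orig)
for _ in range(200):                        # reconstruction loss + L1 sparsity on codes
    h, recon = ae(xb)
    loss = ((recon - xb) ** 2).mean() + 1e-3 * h.abs().mean()
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    deep = ae(xb)[0].numpy()
X_mixed = np.hstack([X_orig, deep])         # embedded hybrid feature set
selector = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_mixed, y)
print("features kept:", int((selector.coef_ != 0).sum()), "of", X_mixed.shape[1])
```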
Subject(s)
Humans , Algorithms , Databases, Factual , Psychotic Disorders , Speech , Speech Perception
ABSTRACT
Objective: To present a stable algorithm that determines, from electroencephalographic measurements, the parameters of dipolar sources associated with epileptic foci located on the surface of the cerebral cortex. Methodology: A boundary value problem is used to establish correlations between the source and the measurements. The problem is divided into two linear subproblems, and in each one the least squares method and Tikhonov regularization are used to find stable solutions. These subproblems are ill-posed in the sense of Hadamard because of their numerical instability: small changes in the measurements can produce large variations in the solution of each problem. The Tikhonov regularization parameter was chosen using the L-curve method. The Fourier series method and the Finite Element Method are used to solve the boundary value problem. Results: A type of source was proposed to represent epileptic foci on the cerebral cortex, together with a stable algorithm for identifying the parameters of such sources. Synthetic examples and MATLAB programs were developed for the case of simple two-dimensional geometry. Originality: The separation of the original problem into two subproblems, as well as the synthetic examples, are products of this research. Conclusion: A stable algorithm was proposed that determines the parameters of dipolar current sources defined on the cerebral cortex.
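A minimal sketch of the regularization step described above, not the authors' MATLAB programs: Tikhonov-regularized least squares for an ill-conditioned linear subproblem, with the L-curve quantities (residual norm versus solution norm) computed over a grid of regularization parameters; the operator and noise level are synthetic.

```python
# Tikhonov-regularized least squares for an ill-posed linear problem A x = b, with
# the regularization parameter examined via the L-curve.
import numpy as np

rng = np.random.default_rng(0)
A = np.vander(np.linspace(0, 1, 50), 20, increasing=True)   # badly conditioned operator
x_true = rng.normal(size=20)
b = A @ x_true + 1e-3 * rng.normal(size=50)                  # noisy "measurements"

def tikhonov(A, b, lam):
    # Solve (A^T A + lam^2 I) x = A^T b
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

lams = np.logspace(-8, 0, 30)
residual = [np.linalg.norm(A @ tikhonov(A, b, l) - b) for l in lams]
solnorm  = [np.linalg.norm(tikhonov(A, b, l)) for l in lams]

# The "corner" of the L-curve (log-log plot of residual vs. solution norm) is a
# common heuristic for choosing the regularization parameter; here we print the curve.
for l, r, s in zip(lams, residual, solnorm):
    print(f"lambda={l:.1e}  ||Ax-b||={r:.3e}  ||x||={s:.3e}")
```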
ABSTRACT
Survival analysis mainly deals with the time to an event, including death, onset of disease, and bankruptcy. A common characteristic of survival data is that they contain "censored" observations, in which the time to event cannot be completely observed and the recorded time instead represents a lower bound; only the earlier of the event time and the censoring time is observed. Many traditional statistical methods have been used effectively for analyzing survival data with censored observations. However, with the development of high-throughput technologies producing "omics" data, more advanced statistical methods, such as regularization, are required to construct predictive survival models from high-dimensional genomic data. Furthermore, machine learning approaches have been adapted for survival analysis to fit nonlinear and complex interaction effects between predictors and to achieve more accurate prediction of individual survival probability. Since most clinicians and medical researchers can now easily access statistical programs for analyzing survival data, a review article is helpful for understanding the statistical methods used in survival analysis. We review traditional survival methods and regularization methods with various penalty functions for the analysis of high-dimensional genomic data, and describe machine learning techniques that have been adapted to survival analysis.
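As a small illustration of the penalized survival models discussed here (not an example from the review itself), the sketch below fits a lasso-type Cox model to simulated right-censored data with the lifelines library; the data generation and the penalizer value are assumptions made for the example.

```python
# Penalized Cox regression on right-censored data: `penalizer` and `l1_ratio` give
# ridge/lasso/elastic-net-style shrinkage for higher-dimensional predictor sets.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n, p = 300, 20
X = rng.normal(size=(n, p))
hazard = np.exp(0.8 * X[:, 0] - 0.5 * X[:, 1])          # only two informative predictors
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(2.0, size=n)              # independent censoring times
df = pd.DataFrame(X, columns=[f"g{i}" for i in range(p)])
df["time"] = np.minimum(event_time, censor_time)
df["event"] = (event_time <= censor_time).astype(int)   # 1 = event observed, 0 = censored

cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)          # lasso-type penalty
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "p"]].head())
```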
Subject(s)
Bankruptcy , Genomics , Machine Learning , Methods , Survival Analysis
ABSTRACT
To assess the background field removal methods commonly used in quantitative susceptibility mapping (QSM), and to analyze the cause of the serious artifacts generated by the truncated k-space division (TKD) method, this paper discusses a variety of background field removal methods and proposes an improved method to suppress the artifacts of susceptibility inversion. First, we scanned phase images with a gradient echo sequence and compared the quality and speed of images reconstructed with sophisticated harmonic artifact reduction for phase data (SHARP), regularization-enabled SHARP (RESHARP), and the Laplacian boundary value (LBV) method. Second, we analyzed how the multiple truncations and the discontinuity of the TKD kernel cause reconstruction artifacts, and proposed an improved TKD method that increases the threshold truncation range and improves data continuity. Finally, the susceptibility inversion results of the improved and original TKD methods were compared. The results show that SHARP and RESHARP reconstructions are very fast, but SHARP produces serious reconstruction artifacts with limited precision, and RESHARP is complicated to implement. The LBV method is slow, but the detail of the reconstructed image is prominent and the precision is high. Among the QSM inversion methods, the original TKD method produces serious reconstruction artifacts, while the improved method yields good artifact suppression and good inversion results in artifact-prone regions.
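A minimal sketch of the TKD inversion step, under assumptions rather than the paper's implementation: the dipole kernel is clipped where its magnitude falls below a threshold before dividing the field map in k-space; the field map here is random and carries no physical units.

```python
# Truncated k-space division (TKD) for QSM: the dipole kernel D has zeros near the
# magic-angle cone, so |D| is clipped at a threshold before dividing the field by it.
import numpy as np

def tkd_inversion(field, voxel_size=(1.0, 1.0, 1.0), threshold=0.2):
    nx, ny, nz = field.shape
    kx, ky, kz = np.meshgrid(
        np.fft.fftfreq(nx, voxel_size[0]),
        np.fft.fftfreq(ny, voxel_size[1]),
        np.fft.fftfreq(nz, voxel_size[2]),
        indexing="ij",
    )
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2                    # unit dipole kernel, B0 along z
    D[k2 == 0] = 0.0
    # Truncate: where |D| < threshold, replace by +/- threshold to avoid blow-up.
    D_trunc = np.where(np.abs(D) < threshold, threshold * np.sign(D + 1e-12), D)
    chi_k = np.fft.fftn(field) / D_trunc
    return np.real(np.fft.ifftn(chi_k))

# Toy usage on a random local-field map (no physical units implied).
field = np.random.default_rng(0).normal(size=(32, 32, 32))
chi = tkd_inversion(field, threshold=0.2)
print(chi.shape, chi.mean())
```

Raising the threshold suppresses streaking artifacts at the cost of underestimating susceptibility, which is the trade-off the improved method addresses by widening the truncation range while keeping the kernel continuous.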
Subject(s)
Algorithms , Artifacts , Brain , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Phantoms, Imaging
ABSTRACT
Abstract: The Native Vegetation Protection Law of 2012 (NVPL) is the main Brazilian regulation protecting native vegetation (NV) on private land. The NVPL, currently in its implementation phase, reduced Legal Reserve (LR) requirements compared with its previous version, the 1965 Forest Act (FA), through several legal mechanisms. Among them, Article 68 (Art. 68) exempts landholders from LR obligations if NV was converted without offending the legislation in place at the time of the conversion. The technical implementation of Art. 68 is controversial and its effects are still unknown. We developed a model to estimate the effects of Art. 68 on LR, using São Paulo State (Brazil) as a case study. We analyzed former environmental laws to identify key periods in which NV preservation requirements changed. We then searched for past spatial data on NV cover with sufficient accuracy for each legal benchmark. Combining legal benchmarks with spatial data, we created two scenarios for the effects of Art. 68, plus a baseline scenario. The first scenario considered a single legal benchmark, the 1965 FA (scenario "1965"), while the other also included the 1989 federal law protecting the Cerrado as a second benchmark (scenario "1965/89"). The baseline scenario did not include the effects of Art. 68. Scenario "1965" reduced LR deficits by 49% compared with the baseline scenario, waiving landholders from restoration or offsetting obligations on 423 thousand hectares (kha) of NV. Scenario "1965/89" waived 507 kha of NV from restoration obligations, a 59% reduction in LR deficit compared with the baseline scenario. The reduction under scenario "1965/89" is of particular importance because the additional cutback is concentrated in the Cerrado, an already very fragmented and impacted region. Together with the reductions from other NVPL rules, the additional effects of Art. 68 raise great concern about the role of LR as a tool for NV preservation on private land, threatening governmental restoration commitments and indicating that command-and-control conservation approaches should be complemented with incentive policies to achieve the desired and committed standards.
ABSTRACT
Pediatric medication information in drug instructions is insufficient because of the absence of clinical studies designed specifically for children. Off-label drug use is therefore ubiquitous in pediatric clinical practice and is likely to lead to medical risks and medical disputes. We should confront this reality and standardize off-label drug use. Enactment of legislation, setting of norms and standards, study of medication information, and intensive supervision are the key strategies for off-label drug use in children.
ABSTRACT
The inverse problem of electrical impedance tomography (EIT) is seriously ill-posed, which restricts the clinical application of EIT. Regularization is an important numerical approach for improving the stability of the EIT inverse problem as well as the resolution of the imaging. This paper proposes a self-diagnosis regularization method based on Tikhonov regularization and the diagonal weight regularization method (DWRM). First, the ill-posedness of the inverse problem is analyzed through the sensitivity. Then, the performance of the self-diagnosis regularization is analyzed through singular value theory. Finally, simulation and tank (flume) experiments are carried out, verifying that the self-diagnosis regularization yields better image quality and anti-noise ability than traditional regularization methods. The self-diagnosis regularization method weakens the ill-posedness of the EIT inverse problem and can promote the practical application of EIT.
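A minimal sketch (not the authors' algorithm) of how the two ingredients combine: one linearized reconstruction step comparing standard Tikhonov regularization with a diagonal weighting matrix taken from the sensitivity matrix itself, in the spirit of DWRM; the sensitivity matrix, target, and noise are all synthetic.

```python
# One linearized EIT reconstruction step with two choices of regularization matrix.
import numpy as np

rng = np.random.default_rng(0)
m, n = 104, 576                      # measurements x pixels (under-determined, ill-posed)
J = rng.normal(size=(m, n)) * np.exp(-np.linspace(0, 5, n))   # toy sensitivity matrix
dsigma_true = np.zeros(n); dsigma_true[200:220] = 1.0
b = J @ dsigma_true + 1e-3 * rng.normal(size=m)               # noisy boundary voltages

def reconstruct(J, b, lam, R):
    # One-step regularized solution: (J^T J + lam * R) x = J^T b
    return np.linalg.solve(J.T @ J + lam * R, J.T @ b)

R_tik = np.eye(n)                                   # Tikhonov: identity matrix
R_diag = np.diag(np.diag(J.T @ J))                  # diagonal weighting from sensitivity
x_tik = reconstruct(J, b, 1e-2, R_tik)
x_diag = reconstruct(J, b, 1e-2, R_diag)
for name, x in [("Tikhonov", x_tik), ("diagonal-weighted", x_diag)]:
    print(name, "relative error:",
          np.linalg.norm(x - dsigma_true) / np.linalg.norm(dsigma_true))
```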
ABSTRACT
ABSTRACT For a long time, economic activities were carried out in the absence of regulatory rules. With the advancement of legislation, many enterprises found themselves in an irregular situation, mainly because they occupy buildings located in specially protected areas. In this condition, the decision to maintain or demolish these buildings should be based on a comparative analysis of the impacts of both alternatives; however, the lack of criteria and standardization of procedures has undermined proper treatment and decision making. The purpose of this paper was therefore to present a methodological proposal for comparative environmental analysis applied to the regularization of irregular enterprises. To that end, a global impact index was constructed, based on a literature review that identified the parameters involved and their formulation using interaction matrices. The method was applied to an enterprise that intervened in a Permanent Preservation Area, within the consolidated urban limit, without prior authorization from the licensing agency. As a result, an index capable of supporting decisions on the maintenance or demolition of irregular enterprises was obtained, taking into account attributes such as duration, extent, and intensity in the formulation of impact magnitude, as well as accumulation, reversibility, and sensitivity to evaluate its importance. Although the proposed method does not eliminate the subjectivity of environmental assessment, it contributes to its standardization through a logical and organized procedure that allows comparison based on quantitative parameters, especially for smaller enterprises, which are generally assessed using only qualitative attributes.
ABSTRACT
Objective: To improve the image quality of electrical impedance tomography (EIT) by introducing prior information into the regularization matrix. Methods: A linear combination of conductivities was established from the dynamically varying background conductivity; the covariance matrix was used to remove the correlation between background conductivities, and this prior information was introduced to construct the regularization matrix. Results: Compared with the traditional regularization matrix, the matrix incorporating prior information on the dynamic background yielded more stable and better images. Conclusion: The trials demonstrate the efficacy of this regularization matrix for EIT imaging within one respiratory (or cardiac) cycle, and subsequent related research may find theoretical reference and support for its feasibility here.
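A minimal sketch under assumptions, not the authors' code: a regularization matrix is built from the covariance of a set of dynamically varying background conductivity frames and used in place of the identity in a one-step regularized reconstruction; all data and the regularization parameter are invented.

```python
# Build a regularization matrix from the covariance of background conductivity frames.
import numpy as np

rng = np.random.default_rng(0)
m, n = 104, 256
J = rng.normal(size=(m, n))                          # toy sensitivity (Jacobian) matrix
backgrounds = rng.normal(size=(30, n)) * 0.1 + 1.0   # dynamically varying background frames

C = np.cov(backgrounds, rowvar=False)                # prior covariance of the background
R_prior = np.linalg.inv(C + 1e-6 * np.eye(n))        # regularization matrix from the prior

dsigma_true = np.zeros(n); dsigma_true[100:110] = 1.0
b = J @ dsigma_true + 1e-3 * rng.normal(size=m)

lam = 1e-2
x_identity = np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ b)
x_prior    = np.linalg.solve(J.T @ J + lam * R_prior, J.T @ b)
for name, x in [("identity", x_identity), ("prior-covariance", x_prior)]:
    print(name, "error:", np.linalg.norm(x - dsigma_true))
```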
ABSTRACT
This article discusses political and clinical aspects of the problem of regulation in its relation to psychoanalysis. On the political side, it highlights that attempts by the State to regulate psychoanalysis have multiplied around the world, making it crucial to understand their context. On the clinical side, regulation is seen as a central aspect of the new cultural configurations from which symptoms are produced. The article investigates the consequences of the privilege given to administrative and bureaucratic knowledge as a response to symptoms, and proposes clinical strategies for this scenario, especially on the basis of the concepts of jouissance and object a.
Subject(s)
Humans , Male , Female , Psychoanalysis , Government Regulation
ABSTRACT
In recent years, genome-wide association (GWA) studies have successfully led to many discoveries of genetic variants affecting common complex traits, including height, blood pressure, and diabetes. Although GWA studies have made much progress in finding single nucleotide polymorphisms (SNPs) associated with many complex traits, such SNPs have been shown to explain only a very small proportion of the underlying genetic variance of complex traits. This is partly due to the fact that most current GWA studies have relied on single-marker approaches that identify single genetic factors individually and have limitations in considering the joint effects of multiple genetic factors on complex traits. Joint identification of multiple genetic factors would be more powerful and would provide better prediction of complex traits, since it utilizes combined information across variants. Recently, a new statistical method was proposed for jointly identifying genetic variants associated with common complex traits via elastic-net regularization. In this study, we applied this joint identification approach to a large-scale GWA dataset (8842 samples and 327,872 SNPs) in order to identify genetic variants associated with obesity in the Korean population. In addition, Gene Ontology and pathway enrichment analyses were conducted to test the biological significance of the jointly identified SNPs.
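A minimal sketch of joint identification with elastic-net regularization (an illustration, not the study's pipeline): an elastic-net-penalized logistic regression is fitted to a synthetic genotype matrix, and the nonzero coefficients give the jointly selected SNPs; sample sizes, effect sizes, and penalty settings are assumptions.

```python
# Joint SNP selection for a binary trait with elastic-net-penalized logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_samples, n_snps = 1000, 800                       # far smaller than 8842 x 327,872
X = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)   # fake minor-allele counts
beta = np.zeros(n_snps)
beta[:20] = 0.2                                     # 20 truly associated SNPs
logits = X @ beta
logits -= logits.mean()                             # roughly balance cases and controls
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

enet = LogisticRegression(penalty="elasticnet", solver="saga",
                          l1_ratio=0.5, C=0.05, max_iter=5000)
enet.fit(X, y)
selected = np.flatnonzero(enet.coef_.ravel())       # jointly selected SNPs
print("SNPs jointly selected:", len(selected),
      "| true positives among the first 20:", int(np.sum(selected < 20)))
```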
Subject(s)
Blood Pressure , Genome-Wide Association Study , Joints , Obesity , Polymorphism, Single Nucleotide
ABSTRACT
This work addresses the interpellation exercised by urban space in the processes of subjectivation of the contemporary subject, with special attention to situations related to land rights and land use. It presents a brief history of the concept of the social use of land in the country and of the public policies that have been implemented for the production of popular housing and for the land-tenure regularization of favelas, and it reports impressions of residents of the favela da Rocinha, located in the city of Rio de Janeiro, regarding this process, currently under way in some of its areas. The residents' accounts were obtained from fieldwork carried out within the research project Espaço Urbano Contemporâneo e Subjetividade: um foco especial sobre as favelas do Rio de Janeiro (Contemporary Urban Space and Subjectivity: a special focus on the favelas of Rio de Janeiro), based at the Programa de Pós-Graduação em Psicologia Social, UERJ, with financial support from Prodoc/CAPES.
Subject(s)
Cities , Land Management and Planning , Poverty Areas , Psychology , Socioeconomic Factors , Brazil
ABSTRACT
In this paper, we consider variable selection methods in the Cox model when a large number of gene expression levels are involved with survival time. Deciding which genes are associated with survival time has been a challenging problem because of the large number of genes and the relatively small sample size (n << p). Several methods for variable selection have been proposed in the Cox model. Among those, we consider the least absolute shrinkage and selection operator (LASSO), threshold gradient descent regularization (TGDR), and two different clustering threshold gradient descent regularization (CTGDR) methods, the K-means CTGDR and the hierarchical CTGDR, and compare these four methods in an application to lung cancer data. Comparison of the four methods shows that the two CTGDR methods yield more compact gene selection than TGDR, while LASSO selects the smallest number of genes. When these methods are evaluated by the approach of Ma and Huang (2007), none of them shows satisfactory performance in separating the two risk groups using the log-rank statistic based on the risk scores calculated from the selected genes. However, when the risk scores are calculated from the genes that are significant in the Cox model, the performance of the log-rank statistics shows that the two risk groups are well separated. In particular, the TGDR method has the largest log-rank statistic, the K-means CTGDR method and the LASSO method show similar performance, and the hierarchical CTGDR method has the smallest log-rank statistic.
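A minimal sketch of the evaluation idea described above, using a lasso-penalized Cox model as a stand-in for the compared methods (this is not the study's analysis): genes are selected by the penalized model, patients are split into two risk groups at the median risk score, and the groups are compared with the log-rank statistic via lifelines; the data are simulated.

```python
# Select genes with a lasso-type Cox model, form two risk groups from the risk scores,
# and compare the groups with the log-rank test.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n, p = 200, 50                                        # patients x genes (kept small here)
X = rng.normal(size=(n, p))
hazard = np.exp(0.7 * X[:, 0] + 0.7 * X[:, 1])        # two truly prognostic genes
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(2.0, size=n)
df = pd.DataFrame(X, columns=[f"gene{i}" for i in range(p)])
df["time"] = np.minimum(event_time, censor_time)
df["event"] = (event_time <= censor_time).astype(int)

cph = CoxPHFitter(penalizer=0.2, l1_ratio=1.0)        # lasso-type penalty on the Cox model
cph.fit(df, duration_col="time", event_col="event")
n_selected = int((cph.params_.abs() > 1e-8).sum())

risk = np.asarray(cph.predict_partial_hazard(df)).ravel()
high = risk > np.median(risk)                         # split patients into two risk groups
res = logrank_test(df["time"][high], df["time"][~high],
                   event_observed_A=df["event"][high],
                   event_observed_B=df["event"][~high])
print("genes selected:", n_selected,
      "| log-rank statistic:", round(res.test_statistic, 2))
```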
Subject(s)
Cluster Analysis , Gene Expression , Lung Neoplasms , Lung , Sample Size
ABSTRACT
Objective: To study the application of the linear sampling method (LSM) to 2D electromagnetic inverse scattering imaging, and in particular to discuss the imaging quality of LSM with partial-view data. Methods: At every sampling point of the imaging region, the norm of the solution vector of the equivalent integral equation is obtained by numerical calculation, and the norm is represented as a gray level to reconstruct the scatterer shape. Results: With all incident angles [0°, 360°] and all observation angles [0°, 360°], very good imaging quality was obtained from the far-field pattern of the scattered wave. When the range of incident and observation angles was reduced, however, the imaging quality declined. In particular, the imaging quality when both the incident and the observation angles were on the same side of the observed object was much better than when they were on opposite sides. Furthermore, the profile of the object was imaged more clearly where its position corresponded to the orientation of the incident/observation angles. Conclusion: Although the imaging quality declines with partial-view data in electromagnetic inverse scattering, the effectiveness of imaging with LSM still deserves attention.
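A heavily simplified sketch of the LSM indicator computation only; the far-field matrix below is random placeholder data rather than a simulated or measured scattering experiment, so the output is not a meaningful image, but it shows the Tikhonov-regularized solution of the far-field equation and the 1/||g|| indicator over a sampling grid.

```python
# Core step of the linear sampling method: solve F g = phi_z for each sampling point z
# with Tikhonov regularization and use 1/||g|| as the shape indicator.
import numpy as np

rng = np.random.default_rng(0)
n_angles = 64
angles = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
F = rng.normal(size=(n_angles, n_angles)) + 1j * rng.normal(size=(n_angles, n_angles))

k = 2 * np.pi                                    # wavenumber (arbitrary units)
alpha = 1e-3                                     # Tikhonov parameter
grid = np.linspace(-1, 1, 40)
indicator = np.zeros((40, 40))
FhF = F.conj().T @ F + alpha * np.eye(n_angles)
for i, x in enumerate(grid):
    for j, y in enumerate(grid):
        # Far-field pattern of a point source at z = (x, y).
        phi_z = np.exp(-1j * k * (x * np.cos(angles) + y * np.sin(angles)))
        g = np.linalg.solve(FhF, F.conj().T @ phi_z)
        indicator[i, j] = 1.0 / np.linalg.norm(g)
print("indicator range:", indicator.min(), indicator.max())
```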
ABSTRACT
The application of an improved BP neural network together with the wavelet transform to the classification of surface EMG signals is described. Data reduction and preprocessing of the signal are performed by the wavelet transform. The network can identify four kinds of forearm movements with high accuracy: hand extension, fist clenching, forearm pronation, and forearm supination. This paper compares the results of the standard BP algorithm with those of Bayesian regularization combined with the Levenberg-Marquardt (LM) algorithm. Experimental results show that the improved BP neural network has great potential for electromechanical prosthesis control because of its enhanced training speed and identification accuracy.
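A minimal sketch of the overall pipeline, not the paper's network: wavelet sub-band energies reduce each surface EMG window to a few features, and a small multilayer perceptron classifies the four movements; scikit-learn's MLPClassifier with an L2 penalty stands in for the Bayesian-regularized LM training used in the paper, and the EMG windows are synthetic.

```python
# Wavelet-based feature reduction of an EMG window followed by an MLP classifier.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
classes = ["hand extension", "fist clench", "pronation", "supination"]

def fake_emg(label, n=256):
    # Synthetic stand-in for one EMG window; each class gets a different band.
    t = np.arange(n)
    return np.sin(2 * np.pi * (0.02 + 0.01 * label) * t) + 0.3 * rng.normal(size=n)

def wavelet_features(sig):
    coeffs = pywt.wavedec(sig, "db4", level=4)            # multilevel DWT
    return np.array([np.sum(c**2) for c in coeffs])       # sub-band energies

X, y = [], []
for label in range(4):
    for _ in range(100):
        X.append(wavelet_features(fake_emg(label)))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), alpha=1e-2, max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)                                       # alpha ~ L2 weight regularization
print("test accuracy:", clf.score(X_te, y_te), "classes:", classes)
```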
ABSTRACT
Because supraventricular tachyarrhythmias after open heart surgery are often resistant to DC cardioversion and treatment with antiarrhythmic agents, we sometimes have difficulty in the postoperative management of these arrhythmias. We used intravenous infusion of diltiazem hydrochloride (3-5 µg/kg/min) in 6 patients with supraventricular tachyarrhythmias after open heart surgery, 5 of whom had atrial fibrillation and 1 sinus tachycardia. The ventricular rate was remarkably reduced from the pretreatment value by this infusion therapy. Diltiazem infusion during atrial fibrillation in the 5 patients regularized the ventricular rate (normalization of R-R intervals). These results indicate that diltiazem was effective in maintaining an almost constant preload with each cardiac cycle for the postoperatively deteriorated cardiac muscle. The hemodynamic parameters obtained with the Swan-Ganz catheter showed that both right and left ventricular function improved after the infusion of diltiazem. There were no adverse effects attributable to the administration of diltiazem. We conclude that intravenous infusion of diltiazem is an effective way to manage supraventricular tachyarrhythmias after open heart surgery without deterioration of cardiac function or side effects.