RESUMEN
Objective: To investigate the regulatory effect of berberine (BBR) on fatty acids in human glioma T98G cells and its effect on cell proliferation, migration, and invasion, and to clarify its potential mechanism. Methods: T98G cells at the logarithmic growth phase were divided into a control group and BBR groups of different concentrations (25, 50, and 100 mg·L-1). A cell wound healing assay was used to detect the migration rates of the cells in each group, and a Transwell chamber assay was used to detect their invasion rates. T98G cells at the logarithmic growth phase were divided into a control group and a 100 mg·L-1 BBR group, and mass spectrometry was used to detect the fatty acid contents in the cells of the two groups. T98G cells at the logarithmic growth phase were divided into a control group and BBR groups of different concentrations (50, 100, and 150 mg·L-1); Western blotting was used to detect the expression levels of phosphatidylinositol 3-kinase (PI3K), phosphorylated PI3K (p-PI3K), protein kinase B (AKT), phosphorylated AKT (p-AKT), sterol regulatory element-binding protein 1 (SREBP-1), and fatty acid synthase (FASN) in the cells of each group. FASN expression was suppressed by gene silencing, and T98G cells at the logarithmic growth phase were divided into a control group, a shFASN1 group, and a shFASN2 group. Western blotting was used to detect the FASN protein levels in each group, a clone formation assay was used to detect clone formation, and a cell wound healing assay was used to detect the migration rates. Results: Compared with the control group, the migration and invasion rates of the cells in the BBR groups were decreased in a concentration-dependent manner (P<0.01), and the fatty acid content in the cells of the 100 mg·L-1 BBR group was significantly decreased (P<0.01). Compared with the control group, the expression levels of p-PI3K, p-AKT, SREBP-1, and FASN proteins in the 150 mg·L-1 BBR group were significantly decreased (P<0.05 or P<0.01), and the expression level of SREBP-1 protein in the 100 and 150 mg·L-1 BBR groups was significantly decreased (P<0.01). After suppression of FASN expression, the FASN protein levels in the shFASN1 and shFASN2 groups were significantly lower than in the control group (P<0.01), and the level in the shFASN2 group was lower than that in the shFASN1 group (P<0.05); compared with the control group, the numbers of clones formed and the migration rates of the cells in the shFASN1 and shFASN2 groups were significantly decreased (P<0.01), and the migration rate of the shFASN2 group was significantly lower than that of the shFASN1 group (P<0.05). Conclusion: BBR interferes with fatty acid synthesis in glioma T98G cells by reducing the expression of PI3K/AKT/SREBP-1/FASN pathway-related proteins, thereby decreasing their migration and invasion capabilities.
RESUMEN
【Objective】 To investigate the physical and neuropsychological development of offspring born to mothers with gestational diabetes mellitus (GDM) at 2 years of age, and to provide evidence for enhancing the physical and neuropsychological development of GDM offspring. 【Methods】 A retrospective analysis was conducted on neonates born in the Department of Obstetrics at Qinzhou Maternal and Child Health Hospital from January 2018 to December 2018 and regularly followed at the outpatient service. The neonates were categorized into two groups based on whether their mothers were diagnosed with GDM during pregnancy: the GDM group (n=243) and the control group (n=362). General clinical data and follow-up information on physical and neuropsychological development at 1 year and 2 years of age were collected for all children. Their height, head circumference, body weight, BMI, and Gesell developmental quotients (DQs) at 1 year and 2 years of age were analyzed for both groups. 【Results】 1) There were no significant differences in height, head circumference, body weight, or body mass index (BMI) between the two groups at 1 year and 2 years of age during the follow-up period (P>0.05). 2) At 1 year of age, the GDM group exhibited higher rates of abnormal development in language (8.6% vs. 3.3%, χ2=7.854), adaptive behavior (11.4% vs. 5.0%, χ2=8.605), and personal-social behavior (8.2% vs. 3.0%, χ2=8.062) compared with the control group (P<0.05), and lower DQs on these Gesell subscales (language: 87.6±7.7 vs. 89.4±9.2, t=2.591; adaptive behavior: 88.4±7.8 vs. 90.5±8.9, t=2.957; personal-social behavior: 89.1±7.0 vs. 91.2±7.5, t=3.495; P<0.05). 3) At 2 years of age, the GDM group also showed higher rates of abnormal adaptive behavior (8.2% vs. 4.1%, χ2=3.927) and personal-social behavior (7.3% vs. 3.0%, χ2=4.093) compared with the control group (P<0.05), and lower DQs on these Gesell subscales (adaptive behavior: 89.5±6.5 vs. 91.9±6.9, t=3.878; personal-social behavior: 89.9±7.1 vs. 92.1±6.9, t=3.311; P<0.05). 【Conclusions】 The development of adaptive behavior and personal-social behavior in offspring born to mothers with GDM remains delayed at 2 years of age. Follow-up of GDM offspring should prioritize the balanced development of adaptive behavior and personal-social behavior.
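As a worked illustration of the group comparisons reported in this abstract, the Python sketch below shows how a two-sample t-test can be run from the published summary statistics (mean ± SD and group sizes) and how a rate comparison can be tested with a chi-square test using SciPy. The 2x2 counts are reconstructed from the reported percentages, so they are approximations; the resulting statistics are illustrative and need not match the published values exactly.

```python
# Hedged sketch: reproduces the *form* of the reported comparisons from summary
# statistics only; counts are reconstructed from percentages and are approximate.
from scipy import stats

n_gdm, n_ctrl = 243, 362

# Two-sample t-test from summary statistics (language DQ at 1 year of age).
t_res = stats.ttest_ind_from_stats(mean1=87.6, std1=7.7, nobs1=n_gdm,
                                    mean2=89.4, std2=9.2, nobs2=n_ctrl,
                                    equal_var=True)

# Chi-square test on abnormal language development rates (8.6% vs. 3.3%),
# using counts reconstructed from the reported percentages.
abn_gdm = round(0.086 * n_gdm)    # ~21 children
abn_ctrl = round(0.033 * n_ctrl)  # ~12 children
table = [[abn_gdm, n_gdm - abn_gdm],
         [abn_ctrl, n_ctrl - abn_ctrl]]
chi2, p, dof, _ = stats.chi2_contingency(table)

print(f"t = {t_res.statistic:.3f}, P = {t_res.pvalue:.4f}")
print(f"chi2 = {chi2:.3f}, P = {p:.4f}")
```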
RESUMEN
Diabetic macular edema (DME) is a complication of diabetic retinopathy (DR) and the main cause of vision loss and blindness in DR patients. Optical coherence tomography (OCT) and optical coherence tomography angiography (OCTA) serve as the principal methods for the non-invasive assessment of microstructural and microvascular pathological changes in the retina, and are widely used for detecting and evaluating DME. As OCT and OCTA technologies advance, various parameters have assumed the role of biomarkers, such as central subfield thickness (CST), cube average thickness (CAT), cube volume (CV), disorganization of the retinal inner layers (DRIL), hyperreflective foci (HRF) and subfoveal neuroretinal detachment (SND). OCT and OCTA are widely used in clinical practice: OCT can visually show the layer changes and subtle structures of the retina and choroid in the macular area, while OCTA is more often used to detect microvascular changes. This article describes the role of OCT- and OCTA-related biomarkers in the prognosis and monitoring of DME; these biomarkers can provide new ideas for monitoring and treatment strategies in DME and offer new insights into the pathogenesis of DR and DME.
RESUMEN
【Objective】 To objectively evaluate the quality control level of the blood testing process in blood banks through quantitative monitoring and trend analysis, and to promote the homogenization and standardized management of blood testing laboratories in blood banks. 【Methods】 A quality monitoring indicator system covering the whole process of blood collection and supply, including blood donation service, blood component preparation, blood testing, blood supply and quality control, was established. The questionnaire Quality Monitoring Indicators for Blood Collection and Supply Process, with clear definitions of indicators and calculation formulas, was distributed to 17 blood banks in Shandong province. Quality monitoring indicators of each blood bank from January to December 2022 were collected, and 31 indicators related to blood testing were analyzed using SPSS 25.0 software. 【Results】 Among unqualified serological test results in the 17 blood bank laboratories, ALT accounted for 55.84%, HBsAg for 13.63%, anti-HCV for 5.08%, anti-HIV for 5.62%, anti-TP for 18.18%, and other factors (mainly sample quality) for 1.65%. The overall detection unqualified rate was (1.23±0.57)%, with a median of 1.11%; the ALT unqualified rate was (0.74±0.53)%, with a median of 0.60%. The detection unqualified rate was positively correlated with the ALT unqualified rate (r=0.974, P<0.05), while the outrage rate was positively correlated with the usage rate (r=0.592, P<0.05). A total of 443 HBV DNA positive samples were detected in all blood banks, with an unqualified rate of 3.78/10 000; 15 HCV RNA positive samples were detected, with an unqualified rate of 0.13/10 000; and 5 HIV RNA positive samples were detected, with an unqualified rate of 0.04/10 000. The NAT unqualified rate was (0.72±0.04)‰, and the single NAT reactive rate [(0.39±0.02)‰] was positively correlated with the single HBV DNA reactive rate [(0.36±0.02)‰] (r=0.886, P<0.05). There was a significant difference in the discriminated reactive rate of individual NAT among three blood bank laboratories (C, F and H) (P<0.05). The median resolution rate by minipool testing in the 17 blood bank laboratories was 36.36%, the median rate of invalid NAT batches was 0.67%, and the median rate of invalid NAT results was 0.07‰. The consistency rate of ELISA dual-reagent detection results was (99.63±0.24)%, and the median duration of equipment failure was 14 days. The error rate of blood type testing in the blood collection department was 0.14‰. 【Conclusion】 The quality monitoring indicator system for the blood testing process in Shandong can monitor potential risks before, during and after testing, has good applicability, feasibility and effectiveness, and can facilitate continuous improvement of the laboratory quality control level. The application of blood testing quality monitoring indicators will promote the homogenization and standardization of blood quality management in Shandong and lay the foundation for future comprehensive evaluations of blood banks.
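For readers who want to reproduce the kind of correlation analysis reported here (e.g., between the overall detection unqualified rate and the ALT unqualified rate across laboratories), the short Python sketch below shows the calculation with SciPy. The abstract reports only aggregate values, not the 17 per-laboratory rates, so the arrays below are hypothetical placeholders for illustration only.

```python
# Minimal sketch of a Pearson correlation between two per-laboratory rates.
# The values below are hypothetical (the abstract gives only mean ± SD),
# so the printed r and P illustrate the method rather than reproduce the study.
import numpy as np
from scipy import stats

detection_unqualified = np.array([1.1, 0.9, 1.6, 1.3, 0.7, 1.8, 1.2])  # % per lab, hypothetical
alt_unqualified       = np.array([0.6, 0.5, 1.1, 0.8, 0.3, 1.3, 0.7])  # % per lab, hypothetical

r, p = stats.pearsonr(detection_unqualified, alt_unqualified)
print(f"r = {r:.3f}, P = {p:.4f}")
```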
RESUMEN
【Objective】 To establish an effective quality monitoring indicator system for blood quality control in blood banks, in order to analyze the quality control indicators for blood collection and supply and evaluate the blood quality control process, thereby promoting continuous improvement and standardized management of blood quality control in blood banks. 【Methods】 A quality monitoring indicator system covering the whole process of blood collection and supply, including blood donation services, component preparation, blood testing, blood supply and quality control, was established. The questionnaire Quality Monitoring Indicators for Blood Collection and Supply Process, which clarified the definitions and calculation formulas of the indicators, was distributed to 17 blood banks in Shandong. Quality monitoring indicator data from January to December 2022 were collected from each blood bank, and data on 20 quality control indicators were analyzed with SPSS 25.0 software. 【Results】 The average pass rates of key equipment monitoring, environment monitoring, key material monitoring, and blood testing item monitoring in the 17 blood banks were 99.47%, 99.51%, 99.95% and 98.99%, respectively. A significant difference was noted in the pass rate of environment monitoring among blood banks of different scales (P<0.05), and the Pearson correlation coefficient (r) between the total number of blood quality testing items and the total amount of blood component preparation was 0.645 (P<0.05). The average blood discarding rates due to testing and non-testing reasons were 1.14% and 3.36%, respectively, showing significant differences among blood banks of different scales (P<0.05). The average discarding rate of lipemic blood was 3.07%, which was positively correlated with the non-testing discarding rate (r=0.9813, P<0.05). There was a statistically significant difference in the discarding rate of lipemic blood between blood banks with lipemic blood control measures and those without (P<0.05). The average discarding rates due to abnormal color, non-standard volume, blood bag damage, hemolysis, blood protein precipitation and blood clotting were 0.20%, 0.14%, 0.06%, 0.06%, 0.02% and 0.02%, respectively, showing statistically significant differences among large, medium and small blood banks (P<0.05). The average discarding rates due to expired blood, other factors, confidential unit exclusion and unqualified samples were 0.02%, 0.05%, 0.003% and 0.004%, respectively. The discarding rate of blood with air bubbles was 0.015%, while those of blood with foreign bodies and unqualified labels were 0. 【Conclusion】 The quality control indicator system of blood banks in Shandong can monitor weak points in process management, with good applicability, feasibility and effectiveness. It is conducive to evaluating different blood banks, continuously improving the quality control level of blood collection and supply, promoting the homogenization and standardization of blood quality management, and laying the foundation for comprehensive evaluation of blood banks in Shandong.
RESUMEN
【Objective】 To establish an effective quality indicator monitoring system, scientifically and objectively evaluate the quality management level of blood banks, and achieve continuous improvement of quality management in blood banks. 【Methods】 A quality monitoring indicator system covering the whole process of blood collection and supply was established, and the questionnaire Quality Monitoring Indicators for Blood Collection and Supply Process, with clear definitions of indicators and calculation formulas, was distributed to 17 blood banks in Shandong. Statistical analysis of 21 quality monitoring indicators covering blood donation service (10 indicators), blood component preparation (7 indicators) and blood supply (4 indicators) from each blood bank from January to December 2022 was conducted using SPSS 25.0 software. The differences in quality monitoring indicators among blood banks of different scales were analyzed. 【Results】 The average values of the quality monitoring indicators for the blood donation service process of the 17 blood banks were as follows: proportion of regular donors 44.66% (2 233/5 000), incidence of adverse reactions 0.22% (11/5 000), non-standard whole blood collection rate 0.46% (23/5 000), missed HBsAg screening rate 0.052% (13/25 000), first-puncture success rate 99.42% (4 971/5 000), double platelet collection rate 86.49% (173/200), 400 mL whole blood collection rate 66.50% (133/200), donor satisfaction rate 99.25% (397/400), use rate of whole blood collection bags with a bypass system and sample tube 82.68% (2 067/2 500), and 1 case of occupational exposure during blood collection. There was a strong positive correlation between the proportion of regular blood donors and the 400 mL whole blood collection rate (P<0.05). The platelet collection rate, incidence of adverse reactions to blood donation, and non-standard whole blood collection rate in large blood banks were significantly lower than those in medium and small blood banks (P<0.05). The average values of the quality monitoring indicators for the blood component preparation process of the 17 blood banks were as follows: the leakage rate of blood component preparation bags was 0.03% (3/10 000), the discarding rate of lipemic blood was 3.05% (61/2 000), and the discarding rate of hemolyzed blood was 0.13% (13/10 000); on average, there were 0.06 cases of labeling errors, 8 bags with blood catheter leaks, 2.76 bags with blood puncture/connection leaks, and 0.59 cases of non-conforming consumables. The discarding rate of hemolyzed blood in large blood banks was significantly lower than that in medium and small blood banks (P<0.05), and the discarding rate of lipemic blood in large and medium blood banks was significantly lower than that in small blood banks (P<0.05). The average values of the quality monitoring indicators for the blood supply process of the 17 blood banks were as follows: the discarding rate of expired blood was 0.023% (23/100 000), the leakage rate during storage and distribution was 0.009% (9/100 000), the discarding rate of returned blood was 0.106% (53/50 000), and the service satisfaction of hospitals was 99.16% (2 479/2 500). The leakage rate of blood components during storage and distribution and the leakage rate of blood component preparation bags differed significantly among blood banks (P<0.05).
There were statistically significant differences in the proportion of regular blood donors, the incidence of adverse reactions, the non-standard whole blood collection rate, the 400 mL whole blood collection rate, the double platelet collection rate, the blood bag leakage rate during the preparation process, the blood component leakage rate during storage and distribution, and the discarding rates of lipemic blood, hemolyzed blood, expired blood and returned blood among large, medium and small blood banks (all P<0.05). 【Conclusion】 The quality monitoring indicator system established for the blood donation service, blood component preparation and blood supply processes in Shandong has good applicability, feasibility and effectiveness. It can objectively evaluate the quality management level, facilitate continuous improvement of the quality management system, promote the homogenization of blood management in the province, and lay the foundation for future comprehensive evaluation of blood banks.
RESUMEN
The awake prone position plays an important role in the treatment of hypoxemia and the improvement of respiratory distress symptoms in non-intubated patients. It is widely used in clinical practice because it is simple to perform, safe, and economical. To enable clinical medical staff to implement prone positioning for awake, non-intubated patients scientifically and in a standardized manner, the consensus development committee, guided by evidence-based methodology and the Delphi method, conducted literature retrieval, literature quality evaluation and evidence synthesis around seven topics: indications and contraindications, evaluation, implementation, monitoring and safety management, termination timing, complication prevention, and health education for the awake prone position. After two rounds of expert consultation, the Expert consensus on implementation strategy of awake prone positioning for non-intubated patients in China (2023) was formulated to provide guidance for clinical medical staff.
RESUMEN
Objective To apply the best evidence on airway clearance for ICU patients, promote the application of the best evidence in clinical practice, and improve nursing quality. Methods The best evidence on airway clearance for ICU patients was summarized. Based on the best evidence, a system of 11 review indicators was established for a clinical baseline review, combining clinical scenario analysis and professional judgment according to the principles of operability, measurability and understandability. Based on the review results and the analysis of obstacle factors, strategies for airway clearance in ICU patients were proposed and implemented in clinical practice. Between September and December 2022, 72 hospitalised patients and 30 nursing staff in the ICU of a general hospital in Wuhan were recruited into the study. Between September and October 2022, routine nursing care for airway clearance was given to the patients, and evidence-based nursing care for airway clearance was offered to the ICU patients between November and December 2022. The clinical pulmonary infection score, nursing staff's knowledge of airway clearance and the implementation rates of the review indicators were compared before and after the application of evidence-based nursing. Results All patients completed the study. After the application of evidence-based nursing practice, the clinical pulmonary infection score decreased from (4.94±1.66) to (4.14±1.68), and the airway clearance knowledge score increased from (49.17±9.38) points to (82.17±10.56) points. The implementation rates of the 11 review indicators were 0-80.00% before the evidence-based practice and improved significantly to 96.67%-100.00% afterwards (all P<0.05). Conclusion Implementing evidence-based nursing practice in airway clearance for ICU patients can reduce clinical pulmonary infections, improve nurses' knowledge of airway clearance, and thereby improve the quality of nursing and promote the recovery of patients.
RESUMEN
Objective: To explore the application effect of the case-based learning (CBL) teaching mode combined with 3D printing in the clinical teaching of sacral tumors. Methods: A total of 108 undergraduate interns and standardized residency training students who studied in our hospital from 2017 to 2018 were divided into the CBL teaching group (n=53) and the CBL combined with 3D printing teaching group (n=55) according to their study time. In the combined teaching group, computed tomography (CT) data were used to reconstruct and print a 3D model of the sacral tumor on the basis of CBL, and preoperative teaching was performed on tumor invasion of the surrounding tissues. The scores of the students in the two groups were evaluated, and the students were surveyed with a self-assessment questionnaire (learning interest, self-learning ability, teamwork ability, comprehensive analysis ability and clinical thinking ability). The t-test (one-sided) was used for comparisons between groups using Stata 14.0. Results: The score of the CBL teaching group (75.90±6.70) was lower than that of the CBL combined with 3D printing teaching group (83.60±7.40). In terms of critical thinking ability, self-learning ability, learning interest, comprehensive analysis ability and clinical thinking ability, the CBL combined with 3D printing teaching group was superior to the CBL teaching group, and the differences were statistically significant (P<0.001). In terms of teamwork ability, there was no statistical difference between the two groups. Conclusion: The CBL teaching mode combined with 3D printing can improve academic performance, students' learning interest and clinical thinking ability in the teaching of sacral tumors to undergraduate interns and standardized residency training students.
RESUMEN
Objective: To evaluate the clinical efficacy of warm acupuncture combined with external application of the Tibetan medicine Baimai Ointment in the treatment of low-back pain of cold-dampness type. Methods: Randomized controlled trial. A total of 60 outpatients of the Tibetan Medicine Hospital of Cuona County from May to July 2021 were selected as the study subjects and divided into two groups by the random number table method, with 30 cases in each group. The control group was treated with Baimai Ointment, and the treatment group was treated with warm acupuncture and Baimai Ointment. Both groups were treated for 2 weeks and followed up for 3 months. The visual analogue scale (VAS) and the Oswestry Disability Index (ODI) were used to evaluate low-back pain and dysfunction, and the clinical efficacy was evaluated. Results: The VAS scores of the treatment group were lower than those of the control group immediately after treatment and at the last follow-up (t=-18.17, -6.05, P<0.01). The ODI score of the treatment group was lower than that of the control group at the last follow-up (t=-15.86, P<0.01). The total effective rate was 96.7% (29/30) in the treatment group and 93.3% (28/30) in the control group, a difference without statistical significance (χ2=0.001, P=1.000). Conclusion: Warm acupuncture combined with the Tibetan medicine Baimai Ointment can effectively improve the clinical symptoms of low-back pain of cold-dampness type and improve patients' quality of life, with satisfactory clinical effect.
RESUMEN
Objective: To summarize the best evidence on thirst management in ICU patients and provide an evidence-based basis for clinical practice. Methods: According to the "6S" evidence pyramid model, the literature on thirst management of ICU patients was systematically retrieved from relevant guideline websites, evidence-based databases, association websites and original literature databases at home and abroad. The retrieval period was from database establishment to June 30, 2022. Two researchers trained in evidence-based nursing independently completed the literature quality evaluation, and the evidence from the literature meeting the quality standards was extracted and summarized. Results: A total of 17 articles were included, comprising 8 randomized controlled trials, 5 quasi-experimental studies and 4 cross-sectional studies. Eighteen pieces of best evidence were formed, covering 5 aspects: basic requirements of thirst management, intervention evaluation, intervention methods, precautions, and health education. Conclusions: This study summarized the best evidence on thirst management in ICU patients. Nurses should translate and apply the best evidence in combination with the clinical situation and the specific policies of the department to relieve thirst symptoms in ICU patients.
RESUMEN
【Objective】 To analyze the current situation of human parvovirus B19 infection among blood donors in different regions of China, so as to provide a basis for formulating reasonable B19 virus screening programs for blood donors in various cities and regions. 【Methods】 The literature related to human parvovirus B19 infection in whole blood and plasma donors published from 1998 to 2021 was searched in the databases, and a meta-analysis of the literature that satisfied the inclusion criteria was conducted with R 4.1.0 software. 【Results】 A total of 35 articles were obtained: 20 articles involving 56 846 blood donor samples and 8 articles involving 1 608 pooled raw plasma samples were subjected to meta-analysis of the B19 DNA positive rates, and 17 articles involving 12 308 blood samples were subjected to meta-analysis of the B19 IgG antibody positive rate. The positive rates of B19 DNA in blood donors (I2=96%, τ2=0.0260, P<0.01) and in pooled raw plasma (I2=98%, τ2=0.1245, P<0.01), as well as the positive rate of B19 IgG antibody (I2=98%, τ2=0.0210, P<0.01), showed significant heterogeneity between regions. The combined positive rate of B19 DNA was estimated to be 2.0% (95% CI: 0.007-0.039), that of pooled raw plasma for production was 66.6% (95% CI: 0.476-0.832), and that of B19 IgG antibody was 30.2% (95% CI: 0.246-0.357). 【Conclusion】 A low B19 virus infection rate and a high IgG antibody positive rate were found in blood donors; therefore, the risk of transfusion-transmitted B19 virus infection is low, but B19 infection in blood donors varies significantly between regions. Domestic cities and regions should reasonably evaluate their own B19 virus infection status and formulate appropriate B19 virus screening programs for blood donors, so as to reduce the risk of transfusion-transmitted B19 virus. In addition, the infection rate of B19 virus in pooled plasma for production is somewhat high; recipients should be screened for B19 virus antibodies, and appropriate transfusion schemes should be formulated for recipients lacking neutralizing B19 IgG antibodies to reduce their exposure to B19 virus.
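The pooled positive rates, τ2 and I2 reported above come from a random-effects meta-analysis of proportions. A minimal Python sketch of one common approach (inverse-variance pooling of logit-transformed proportions with a DerSimonian-Laird estimate of τ2) is given below; the per-study counts are hypothetical, since the abstract does not list the individual studies, and the original analysis in R 4.1.0 may have used a different transformation or model.

```python
# Minimal sketch of random-effects pooling of proportions (DerSimonian-Laird),
# using hypothetical per-study counts for illustration only.
import numpy as np
from scipy.special import logit, expit

# hypothetical (events, total) pairs, one per study
studies = [(12, 800), (30, 1500), (5, 600), (40, 2000)]
events = np.array([e for e, n in studies], dtype=float)
totals = np.array([n for e, n in studies], dtype=float)

# logit-transformed proportions and their approximate variances
p = events / totals
y = logit(p)
v = 1.0 / events + 1.0 / (totals - events)
w = 1.0 / v                                   # fixed-effect (inverse-variance) weights

# heterogeneity: Q, tau^2 (DerSimonian-Laird) and I^2
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
df = len(studies) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

# random-effects pooled estimate and 95% CI, back-transformed to a proportion
w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
pooled = expit(y_re)
ci = expit(np.array([y_re - 1.96 * se_re, y_re + 1.96 * se_re]))
print(f"pooled rate = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), "
      f"I2 = {I2:.1f}%, tau2 = {tau2:.4f}")
```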
RESUMEN
Enteral nutrition plays an irreplaceable role in the nutritional treatment of critically ill patients. To help clinical medical staff manage the common complications encountered during the implementation of enteral nutrition in critically ill patients, the consensus writing team carried out literature retrieval, literature quality evaluation and evidence synthesis. Topics such as diarrhea, aspiration, high gastric residual volume and abdominal distension were assessed using evidence-based methodology and the Delphi method. After two rounds of expert consultation, the Expert consensus on prevention and management of enteral nutrition therapy complications for critically ill patients in China (2021 edition) was developed to provide guidance for clinical medical staff.
RESUMEN
Objective: To study the feasibility of using bedside ultrasound to evaluate gastric residual volume in critically ill patients receiving enteral nutrition support. Methods: From May 2019 to August 2019, 60 patients receiving enteral nutrition via gastric tube in the ICU of Union Hospital, Tongji Medical College, Huazhong University of Science and Technology were selected. Patients were divided into the experimental group and the control group according to odd and even bed numbers, with 30 patients with odd bed numbers in the experimental group and 30 patients with even bed numbers in the control group. Gastric residual volume was evaluated at 0, 4, 8, 12, 16, 20 and 24 h of enteral nutrition. In the experimental group, gastric residual volume was evaluated by both bedside ultrasound and syringe suction at each time point; in the control group, only bedside ultrasound was used. The operation time, monitoring results at different time points, time to reach the target feeding amount, and the rates of vomiting, diarrhea and gastrointestinal motility drug use were compared between the two groups. Results: There was no statistical difference between the gastric residual volume monitored by ultrasound and that monitored by suction (P>0.05). The operating time of bedside ultrasound monitoring was (62.40 ± 4.00) s and that of suction monitoring was (78.39 ± 12.15) s; the operating time of bedside ultrasound monitoring was shorter than that of suction (t=6.633, P<0.01). There was no significant difference in the rates of vomiting, diarrhea or gastrointestinal motility drug use between the two groups (P>0.05). The time to reach the target feeding amount was (3.04 ± 0.31) d in the control group and (4.19 ± 0.33) d in the experimental group; the time in the control group was shorter than that in the experimental group (t=13.42, P<0.01). Conclusions: Bedside ultrasound can be used to evaluate the gastric residual volume of patients receiving enteral nutrition support and to guide the implementation of enteral nutrition; it shortens the operation time, reduces the workload of nurses, and avoids contamination of the enteral nutrition preparation.
RESUMEN
Objective: To investigate the difference between the peripheral ionized calcium (iCa) concentrations monitored at different blood collection points and the target concentration for anticoagulant efficacy in patients receiving continuous renal replacement therapy (CRRT), so as to provide a scientific basis for choosing the best blood collection point in clinical practice. Methods: Patients receiving CRRT with 4% citrate anticoagulation in the Department of Critical Care Medicine, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology were enrolled as study subjects. The Prisma-Flex V8 CRRT machine and department-prepared substitution fluid and dialysate were used for all patients receiving CRRT. Patients were divided into a continuous veno-venous hemofiltration (CVVH) group (n=10) and a continuous veno-venous hemodialysis (CVVHD) group (n=30) depending on their actual conditions and treatment needs. In both groups, blood samples were collected at specific sites before and after the filter for determination of the extracorporeal peripheral ionized calcium concentration at 2, 4, 8, 14 and 20 hours after the start of CRRT. The target concentration of extracorporeal peripheral ionized calcium was set at 0.2-0.4 mmol/L to ensure the efficacy of extracorporeal citrate anticoagulation. Results: A total of 400 test results were obtained from the 40 included patients during treatment. In the CVVH group, 100 test results were obtained at 2, 4, 8, 14 and 20 hours after the start of CRRT, and no statistically significant difference was found (P>0.05). In the CVVHD group, 300 test results were obtained at 2, 4, 8, 14 and 20 hours after the start of CRRT. The iCa concentrations before the filter were (0.53±0.01), (0.50±0.01), (0.52±0.01), (0.53±0.01) and (0.53±0.02) mmol/L, while the iCa concentrations after the filter were (0.41±0.01), (0.40±0.01), (0.39±0.02), (0.41±0.01) and (0.40±0.01) mmol/L, respectively, and the differences were statistically significant (t values 75.24-103.41, P<0.01). Conclusions: For patients receiving CRRT with citrate anticoagulation, blood samples collected before and after the filter do not reflect the efficacy of citrate anticoagulation equally in all CRRT modes, which matters for the safe use of the extracorporeal circuit and filter. In CVVH mode, iCa concentrations determined from samples collected before and after the filter reflect the efficacy of citrate anticoagulation equally well, whereas in CVVHD mode, blood collection and determination are suggested to be conducted at the site before the filter to facilitate assessment of the efficacy of citrate anticoagulation.
RESUMEN
Objective: To investigate the effect of sublingual immunotherapy (SLIT) with Dermatophagoides farinae on the expression of specific IgG4 (sIgG4) in patients with allergic rhinitis (AR) in the Hainan area. Methods: Seventy-two patients with dust-mite allergic rhinitis, all of whom had been local islanders in Hainan for three generations, were randomly divided into a control group (36 cases) and a SLIT group (36 cases). sIgG4 and sIgE expression levels were detected before treatment and at 6, 12 and 18 months after treatment. The patients' symptom scores, medication scores, VAS scores and adverse reactions were also assessed. Finally, through statistical analysis of the relevant data collected at the 4 time points in the two groups, the efficacy and safety of sublingual specific immunotherapy and the changes in sIgG4 antibody expression levels in patients with allergic rhinitis in Hainan were observed. Results: Symptom scores, medication scores and VAS scores were significantly improved in the SLIT group after treatment compared with before treatment (P<0.05), and serum sIgG4 increased significantly (P<0.01), while serum sIgE showed no significant change (P>0.05). In the control group, symptom scores, medication scores and VAS scores were also significantly improved compared with before treatment (P<0.05), while serum sIgG4 and sIgE showed no significant change (P>0.05). Comparing the two groups, the symptom scores, medication scores and VAS scores of the SLIT group were significantly lower than those of the control group at 12 and 18 months after treatment (P<0.05), and sIgG4 expression levels in the SLIT group were significantly higher than those in the control group after 6, 12 and 18 months of treatment (P<0.01). There was no significant difference in sIgE expression levels between the two groups (P>0.05). No severe systemic adverse reactions occurred in either group, and 3 patients in the SLIT group showed mild adverse reactions. Conclusion: Sublingual immunotherapy with Dermatophagoides farinae was effective and increased the expression of sIgG4 in patients with Dermatophagoides farinae-induced AR; sIgG4 is expected to serve as an immunological marker for the objective evaluation of the clinical efficacy of sublingual immunotherapy.
RESUMEN
Objective: To investigate the prevalence of nonalcoholic fatty liver disease (NAFLD) and related abnormal indicators among taxi drivers in Shenzhen, China, and to provide a basis for the scientific prevention and treatment of fatty liver disease. Methods: A total of 1752 taxi drivers who underwent physical examination in Shenzhen Longhua District People’s Hospital from May 2018 to June 2019 were selected, and related indicators were measured, including body mass index (BMI), blood pressure, fasting plasma glucose (FPG), total cholesterol (TC), triglyceride (TG), high-density lipoprotein (HDL), low-density lipoprotein (LDL), alanine aminotransferase (ALT), aspartate aminotransferase (AST), and uric acid (UA). Liver ultrasound examination was also performed. The association between the prevalence rate of NAFLD and the various biochemical parameters was analyzed. The t-test was used for comparison of continuous data between two groups, and the chi-square test was used for comparison of categorical data between two groups. Results: The prevalence rate of NAFLD among the taxi drivers was 51.66% (905/1752), and male drivers had a significantly higher prevalence rate than female drivers [57.94% (770/1329) vs 31.91% (135/423), χ2=9.209, P=0.027]. The taxi drivers with NAFLD had significantly higher abnormal rates of BMI, blood lipids, blood pressure, FPG, and UA than those without NAFLD (χ2=5.894, 7.126, 8.045, 8.909, and 10.373; P=0.047, 0.035, 0.030, 0.028, and 0.018). The taxi drivers with a BMI of ≥28 kg/m2 had a significantly higher prevalence rate of NAFLD than those with a BMI of 24.0-27.9 kg/m2 or <24 kg/m2 (male: χ2=7.904 and 18.624, P=0.035 and 0.008; female: χ2=8.613 and 31.635, P=0.029 and 0.006). The taxi drivers with more than 15 working years had a significantly higher prevalence rate of NAFLD than those with 11-15, 5-10, and <5 working years (male: χ2=9.781, 13.546, and 18.052, P=0.024, 0.012, and 0.008; female: χ2=7.052, 9.847, and 12.157, P=0.036, 0.023, and 0.016). Conclusion: There is a high prevalence rate of NAFLD among taxi drivers in Shenzhen, and male drivers have a higher prevalence rate than female drivers. The prevalence rate of NAFLD is associated with abnormal blood lipids, obesity, hyperglycemia, and hyperuricemia, as well as with years of driving.
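For the male-versus-female comparison above, a chi-square test on the reported counts can be computed directly, as in the Python sketch below. This is a plain 2x2 test on the counts given in the abstract; it may not match the published χ2 value exactly if the original analysis applied a correction or an adjusted model.

```python
# Minimal sketch: 2x2 chi-square test of NAFLD prevalence, male vs female drivers,
# using the counts reported in the abstract (770/1329 vs 135/423).
from scipy.stats import chi2_contingency

table = [[770, 1329 - 770],   # male: NAFLD, no NAFLD
         [135, 423 - 135]]    # female: NAFLD, no NAFLD
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, P = {p:.3g}")
```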
RESUMEN
Bedside ultrasound plays an important role in the evaluation of critically ill patients. In order to standardize the application of bedside ultrasound, Chinese Research Hospital Association of Critical Care Medicine and Nursing Research Group of Chinese Research Hospital Association of Critical Care Medicine organized the experts in related fields in China to analyze, discuss and summarize the following contents: ① bedside ultrasound assessment of lungs; ② bedside ultrasound-guided nutrition tube placement; ③ bedside ultrasound assessment of gastric residual volume; ④ bedside ultrasound-guided endovascular catheterization. Finally, the Evidence-based nursing expert consensus on adult bedside ultrasound was formulated.
RESUMEN
Objective: To study the characteristics of the neoplasm invasion types in the ovary in two orthotopic nude mouse models established with human epithelial ovarian cancer solid tumor tissue slices and the human ovarian carcinoma cell line OVCAR-3, and in human epithelial ovarian cancer. Methods: Tumor tissues and the OVCAR-3 cell line of human epithelial ovarian cancer were grown in subcutaneous tissue, and the subcutaneous tumor source was harvested and inoculated under the ovarian capsule of nude mice to establish the orthotopic implantation models. The neoplasm invasion types in the two kinds of models were observed. The neoplasm invasion types were also analyzed by pathological examination in 54 cases of International Federation of Gynecology and Obstetrics (FIGO) stage Ⅰ-Ⅱ epithelial ovarian cancer. Results: Three neoplasm invasion types were found: the pseudocapsule type, the pseudocapsule invasion type, and the pseudocapsule penetration type. The pseudocapsule rate in the solid tumor slices group (18.2%) was lower than that in the cell line group (42.3%) (P<0.05), while the pseudocapsule penetration rate in the solid tumor slices group (50.0%) was higher than that in the cell line group (23.1%) (P<0.05). No difference in the pseudocapsule invasion rate was found between the two groups (P>0.05). The neoplasm invasion type in the ovary changed with tumor implantation time: a high proportion of the pseudocapsule type was found at the beginning of tumor implantation, and the pseudocapsule penetration rate rose as the implantation time increased. A high proportion of the pseudocapsule type was also found in patients with FIGO stage Ⅰ epithelial ovarian cancer, and the pseudocapsule penetration rate increased in those with FIGO stage Ⅱ. No difference in neoplasm invasion type was found between the two pathological types (P>0.05). Conclusions: There are differences between the two orthotopic models established with human epithelial ovarian cancer solid tumor tissue slices and the human ovarian carcinoma cell line OVCAR-3. Compared with the solid tumor slices model, the cell line model is more stable for follow-up studies. The proportions of the three neoplasm invasion types in the ovary were more balanced at 8 weeks after tumor implantation, so 8 weeks after implantation is the best starting time for follow-up experiments.
RESUMEN
Glioblastoma (GBM) is one of the most common tumors of the central nervous system and the most lethal brain cancer. GBM treatment is based primarily on surgical resection combined with radiotherapy and chemotherapy. Despite active treatment, progression-free survival and overall survival are not significantly prolonged because GBM almost always recurs, so new and effective treatments are continually being sought. In recent years, a novel treatment modality for cancer, tumor treating fields (TTFields), has been proposed. TTFields devices were approved by the Food and Drug Administration (FDA) for the adjuvant treatment of recurrent and newly diagnosed GBM in 2011 and 2015, respectively. This made TTFields the first breakthrough treatment for GBM in the 10 years after the FDA approved bevacizumab for patients with relapsed GBM in 2009. This paper summarizes recent research on TTFields and elaborates its mechanism of action in GBM, covering cell and animal experimental research, clinical application and social benefits.