1.
J Environ Manage ; 365: 121538, 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38905798

ABSTRACT

The current study analyzes the impacts of climate change and land use/land cover (LULC) changes on sediment yield in the Puthimari basin, an Eastern Himalayan sub-watershed of the Brahmaputra, using a hybrid SWAT-ANN modelling approach. The analysis was segmented into three time spans: 2025-2049, 2050-2074, and 2075-2099. The method integrates projections from multiple climate models under two Representative Concentration Pathways (RCP4.5 and RCP8.5), along with LULC projections generated with the Cellular Automata Markov model. By combining the strengths of the Soil and Water Assessment Tool (SWAT) and artificial neural network (ANN) techniques, the study aims to improve the accuracy of sediment yield simulations under changing environmental conditions. The non-linear autoregressive model with exogenous input (NARX) was adopted for the ANN component of the hybrid model. The hybrid SWAT-ANN approach was markedly more effective than the SWAT model alone, with a coefficient of determination of 0.74 for the hybrid model compared with 0.35 for the standalone SWAT model. Under the RCP4.5 scenario, during 2075-2099, the study noted a 29.34% increase in sediment yield, accompanied by rises of 42.74% in discharge and 27.43% in rainfall during the Indian monsoon season (June to September). Under the RCP8.5 scenario, for the same period, the increases in monsoon-season sediment yield, discharge, and rainfall were 116.56%, 103.28%, and 64.72%, respectively. This analysis of the factors influencing sediment supply in the Puthimari River basin fills an important knowledge gap and provides insights for designing proactive flood and erosion management strategies. The findings are important for understanding the vulnerability of the Puthimari basin to climate and land use change; incorporating them into policy and decision-making can help stakeholders enhance resilience and sustainability in the face of future hydrological and environmental challenges.
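
A minimal Python sketch of the NARX idea behind the hybrid step is given below: lagged exogenous drivers (here, SWAT-simulated sediment yield, rainfall, and discharge) and lagged target values feed a small neural network that predicts the next sediment-yield value. The synthetic series, lag length, and network size are illustrative assumptions, not taken from the paper.

```python
# NARX-style sketch: predict sediment yield from lagged exogenous drivers and
# lagged observations. All series below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 300
rainfall = rng.gamma(2.0, 50.0, n)                       # synthetic monthly rainfall
discharge = 0.8 * rainfall + rng.normal(0.0, 10.0, n)    # synthetic discharge
swat_sed = 0.5 * discharge + rng.normal(0.0, 20.0, n)    # stand-in for SWAT sediment yield
obs_sed = 0.55 * discharge + 0.1 * rainfall + rng.normal(0.0, 5.0, n)

def make_narx_features(exog, target, lags=3):
    """Stack lagged exogenous inputs and lagged target values (NARX form)."""
    X, y = [], []
    for t in range(lags, len(target)):
        X.append(np.concatenate([exog[t - lags:t].ravel(), target[t - lags:t]]))
        y.append(target[t])
    return np.array(X), np.array(y)

exog = np.column_stack([swat_sed, rainfall, discharge])
X, y = make_narx_features(exog, obs_sed, lags=3)

split = int(0.7 * len(X))                                # chronological train/test split
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])
print("R^2 on held-out period:", round(model.score(X[split:], y[split:]), 3))
```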

2.
Adv Biomed Res ; 12: 229, 2023.
Article in English | MEDLINE | ID: mdl-38073735

ABSTRACT

Background: Maintaining normal left ventricular geometry and function depends on the structural integrity of the mitral valve. Irreparable damage to the mitral valve calls for its replacement with either a valve made of biological tissue or one made of metal, pyrolytic carbon, or similar materials. Materials and Methods: The material consisted of 50 formalin-fixed, apparently normal adult cadaveric hearts of either sex, received from the Departments of Anatomy of various institutes in the northern region. These hearts were cut open to access the mitral valve in the left ventricle. Results: In this study, the posterior leaflet was semi-oval in shape and 3.72 cm wide at the base. Although usually described as tri-scalloped, it was found to be so in only 56% of the hearts, being bi-scalloped in 20% and single-cusped in 16%. Four scallops and six scallops were observed in three (6%) and one (2%) hearts, respectively. Conclusions: Notable variation was seen in the scalloping of the posterolateral cusp in the present study. The number of scallops varied widely, from one to four, and, most notably, six (hexa-scalloped), which has not been reported in previous studies. Such variations are important for understanding the rationale behind each architectural layout. Cardiothoracic surgeons may find this information valuable for mitral valve repair surgery.

3.
Comput Intell Neurosci ; 2023: 2002855, 2023.
Article in English | MEDLINE | ID: mdl-37868750

ABSTRACT

A brain tumor is a serious malignant condition caused by unregulated and aberrant cell division. Recent advances in deep learning have aided the healthcare industry, particularly diagnostic imaging for the diagnosis of numerous disorders. The most frequently and widely used machine learning model for image recognition is the convolutional neural network (CNN). In our study, we classify brain MRI images using a CNN combined with data augmentation and image processing techniques. We compared the performance of a CNN trained from scratch with that of a pretrained VGG-16 model used for transfer learning. Although the investigation was carried out on a small dataset, the results indicate that our model is accurate and of very low complexity, achieving 100 percent accuracy compared with 96 percent for VGG-16. Compared with existing pretrained methods, our model uses far fewer computing resources while producing substantially higher accuracy.


Subject(s)
Brain Neoplasms , Machine Learning , Humans , Magnetic Resonance Imaging , Neuroimaging , Brain Neoplasms/diagnostic imaging
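
The comparison described above can be sketched as follows in Keras: a small CNN trained from scratch versus a frozen, ImageNet-pretrained VGG-16 base with a new classification head. The random placeholder tensors, input size, and layer sizes are assumptions for illustration; the paper's actual architecture, augmentation, and dataset are not reproduced here.

```python
# Scratch CNN vs. frozen VGG-16 transfer learning for binary MRI classification.
# Tiny random tensors stand in for the MRI data so the script runs end-to-end.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

x = np.random.rand(20, 224, 224, 3).astype("float32")    # placeholder "MRI" images
y = np.random.randint(0, 2, size=(20,))                   # tumor / no tumor labels

# (a) small CNN trained from scratch
scratch = models.Sequential([
    layers.Input(shape=(224, 224, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
scratch.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
scratch.fit(x, y, epochs=1, verbose=0)

# (b) ImageNet-pretrained VGG-16 base, frozen, with a new classification head
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False
vgg = models.Sequential([base, layers.GlobalAveragePooling2D(),
                         layers.Dense(1, activation="sigmoid")])
vgg.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
vgg.fit(x, y, epochs=1, verbose=0)
```
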
4.
Cureus ; 14(4): e24489, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35651457

ABSTRACT

Coronary artery calcification represents one of the most challenging and demanding subsets of percutaneous coronary intervention (PCI). Accumulated evidence indicates that shockwave intravascular lithotripsy (IVL) is a reliable tool for overcoming calcified stenosis in coronary arteries. However, there is a lack of published case reports in the Indian context. Herein, we describe a case of right coronary artery (RCA) calcification successfully managed with shockwave IVL-assisted staged PCI. A 74-year-old male patient initially presented with ST-segment elevation myocardial infarction (STEMI). Coronary angiography at that time demonstrated calcific thrombotic occlusion of the left anterior descending artery (LAD) and stenosis of the proximal and mid tubular RCA. It was decided to proceed with immediate PCI of the LAD followed by staged PCI of the RCA. The patient presented with unstable angina at the time of the second, staged PCI of the RCA and was managed with shockwave IVL-assisted PCI. Ultimately, the patient's condition improved, with good Thrombolysis in Myocardial Infarction (TIMI) flow achieved.

5.
Comput Intell Neurosci ; 2022: 3820360, 2022.
Article in English | MEDLINE | ID: mdl-35463255

ABSTRACT

Diabetes prediction is an active research area in which experts from the medical field are trying to address the problem with greater accuracy. Surveys conducted by the WHO have shown a remarkable increase in the number of diabetic patients. Diabetes generally remains dormant, and it aggravates other conditions such as damage to the kidney vessels, problems in the retina of the eye, and cardiac problems; if unidentified, it can create metabolic disorders and many complications in the body. The main objective of our study is to carry out a comparative study of different classifiers and feature selection methods to predict diabetes with greater accuracy. In this paper, we study multilayer perceptron, decision tree, K-nearest neighbour, and random forest classifiers, and several feature selection techniques were applied to the classifiers to detect diabetes at an early stage. The raw data were preprocessed by removing outliers and imputing missing values with the mean, followed by hyperparameter optimization. Experiments were conducted on the Pima Indians diabetes dataset using Weka 3.9; the accuracies achieved were 77.60% for the multilayer perceptron, 76.07% for decision trees, 78.58% for K-nearest neighbour, and 79.8% for random forest, the random forest classifier achieving the best accuracy.


Subject(s)
Diabetes Mellitus , Machine Learning , Cluster Analysis , Diabetes Mellitus/diagnosis , Humans , Neural Networks, Computer
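
A scikit-learn sketch of the classifier comparison is shown below (the study itself used Weka 3.9). A synthetic dataset stands in for the Pima Indians data, and mean imputation mirrors the missing-value handling described above; feature selection and hyperparameter tuning are omitted for brevity.

```python
# Compare MLP, decision tree, KNN, and random forest with mean imputation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=768, n_features=8, random_state=0)  # stand-in for Pima
X[np.random.default_rng(0).random(X.shape) < 0.05] = np.nan              # simulate missing values

classifiers = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "Random forest": RandomForestClassifier(random_state=0),
}
for name, clf in classifiers.items():
    pipe = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler(), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy {acc:.3f}")
```
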
6.
J Healthc Eng ; 2022: 8100697, 2022.
Article in English | MEDLINE | ID: mdl-35449835

ABSTRACT

Diabetes is a chronic disease characterized by a high level of glucose in the blood and can cause many complications in the body, such as internal organ failure, retinopathy, and neuropathy. According to predictions made by the WHO, the number of people with diabetes may reach approximately 642 million by 2040, meaning that roughly one in ten may suffer from diabetes owing to unhealthy lifestyles and lack of exercise. Many authors have researched diabetes prediction through machine learning algorithms extensively in the past. The idea that motivated us to present a review of various diabetes prediction models is to address the prediction problem by identifying, critically evaluating, and integrating the findings of all relevant, high-quality individual studies. In this paper, we analyse the work done by various authors on diabetes prediction methods. Our analysis of diabetes prediction models aimed to identify the methods used, select the best-quality studies, and synthesize their findings. Analysis of diabetes data is challenging because most data in the medical field are nonlinear, non-normal, correlated, and complex in nature. Machine learning-based algorithms are now widely applied in healthcare and medical imaging. Predicting diabetes mellitus at an early stage requires a different approach from those used for other problems. Machine learning-based risk stratification can be used to categorize patients into diabetic and control groups. This review comprises articles from various sources and should help other researchers working on diabetes prediction models.


Subject(s)
Diabetes Mellitus , Algorithms , Exercise , Humans , Machine Learning
7.
Indian J Nucl Med ; 31(2): 108-13, 2016.
Article in English | MEDLINE | ID: mdl-27095858

ABSTRACT

INTRODUCTION: It is essential to ensure the uniform response of a single-photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the cobalt-57 flood source and on the counts prespecified in the vendor's protocol (usually 4000K-10,000K counts). If the total acquired counts are less than the prespecified counts, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we developed and verified a technique that reads the flood source image, removes unwanted information, and automatically extracts and saves the useful field of view (UFOV) and central field of view (CFOV) images for the calculation of the uniformity parameters. MATERIALS AND METHODS: The technique was implemented in MATLAB R2013b running on the Ubuntu operating system and was verified on simulated and real flood source images. RESULTS: The accuracy of the technique was found to be encouraging, especially in view of the practical difficulties with vendor-specific protocols. CONCLUSION: It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
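
The preprocessing idea can be sketched in Python as follows (the study implemented it in MATLAB R2013b): threshold the flood image to find the irradiated area, then crop the UFOV and CFOV sub-images. The 95% and 75% crop fractions follow the usual NEMA convention and, like the simulated rectangular flood field, are assumptions rather than details taken from the paper.

```python
# Locate the irradiated area of a flood image, then crop UFOV and CFOV.
import numpy as np

def extract_fovs(flood, threshold_frac=0.1, ufov_frac=0.95, cfov_frac=0.75):
    """Return (UFOV, CFOV) sub-images of a 2-D flood-source array."""
    mask = flood > threshold_frac * flood.max()          # drop empty border pixels
    rows, cols = np.where(mask)
    r0, r1, c0, c1 = rows.min(), rows.max(), cols.min(), cols.max()

    def crop(frac):
        h, w = r1 - r0 + 1, c1 - c0 + 1
        dr, dc = int(h * (1 - frac) / 2), int(w * (1 - frac) / 2)
        return flood[r0 + dr:r1 + 1 - dr, c0 + dc:c1 + 1 - dc]

    return crop(ufov_frac), crop(ufov_frac * cfov_frac)  # CFOV = 75% of UFOV

# simulated flood image: rectangular irradiated area plus Poisson counting noise
rng = np.random.default_rng(0)
flood = np.zeros((256, 256))
flood[20:236, 30:226] = 1000.0
flood = rng.poisson(flood).astype(float)

ufov, cfov = extract_fovs(flood)
iu = 100 * (cfov.max() - cfov.min()) / (cfov.max() + cfov.min())  # raw integral uniformity
print(ufov.shape, cfov.shape, round(iu, 1))                        # (no smoothing applied)
```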

8.
Indian J Nucl Med ; 31(1): 20-6, 2016.
Article in English | MEDLINE | ID: mdl-26917889

ABSTRACT

PURPOSE: The role of (18)F-fluorodeoxyglucose positron emission tomography (PET) is limited for the detection of primary hepatocellular carcinoma (HCC) because of the low contrast between the tumor and normal hepatocytes (background). The aim of the present study was to improve the contrast between the tumor and background by standardizing the input parameters of a digital contrast enhancement technique. MATERIALS AND METHODS: A transverse PET slice was adjusted for the best possible contrast and saved in JPEG 2000 format. We processed this image with a contrast enhancement technique using 847 possible combinations of input parameters (threshold "m" and slope "e"). The input parameters that resulted in an image with high values of second-order entropy and edge content and low values of absolute mean brightness error and saturation evaluation metrics were taken as the standardized input parameters. The same process was repeated for a total of nine PET-computed tomography studies, thus analyzing 7623 images. RESULTS: The selected digital contrast enhancement technique increased the contrast between the HCC tumor and the background. In seven of the nine images, the standardized input parameter "m" had values between 150 and 160; for the other two images the values were 138 and 175, respectively. The value of slope "e" was 4 in four images, 3 in three images, and 1 in two images. It was found to be important to optimize the input parameters for the best possible contrast for each image; a single value was not sufficient for all the HCC images. CONCLUSION: The use of the above digital contrast enhancement technique improves the tumor-to-background ratio in PET images of HCC and appears to be useful. Further clinical validation of this finding is warranted.
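
A sketch of the parameter-standardization idea follows, assuming the classical contrast-stretching transform s = 1 / (1 + (m/r)^e) with threshold m and slope e, and using Shannon entropy as a stand-in for the full set of evaluation metrics (second-order entropy, edge content, absolute mean brightness error, and saturation); the actual transform and metrics used in the paper are not reproduced here.

```python
# Grid search over (m, e) for a contrast-stretching transform, scored by entropy.
import numpy as np

def enhance(img, m, e, eps=1e-6):
    """Contrast-stretching transform s = 1 / (1 + (m / r)^e), output in (0, 1)."""
    return 1.0 / (1.0 + (m / (img + eps)) ** e)

def shannon_entropy(img01):
    hist, _ = np.histogram(img01, bins=256, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

img = np.random.default_rng(0).random((128, 128)) * 255.0   # stand-in PET slice
candidates = [(m, e) for m in range(130, 181, 5) for e in (1, 2, 3, 4)]
best_m, best_e = max(candidates,
                     key=lambda me: shannon_entropy(enhance(img, me[0], me[1])))
print("selected threshold m and slope e:", best_m, best_e)
```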

9.
Indian J Nucl Med ; 30(2): 128-34, 2015.
Article in English | MEDLINE | ID: mdl-25829730

ABSTRACT

OBJECTIVE: The accuracy of in vivo activity quantification improves after correction for penetrated and scattered photons. However, accurate assessment of these components is not possible with physical experiments. We used Monte Carlo simulation to accurately assess the contribution of penetrated and scattered photons in the photopeak window. MATERIALS AND METHODS: Simulations were performed with the Simulation of Imaging Nuclear Detectors (SIMIND) Monte Carlo code. The simulations were set up so that they provide geometric, penetration, and scatter components after each run and write binary images to a data file. These components were analyzed graphically using Microsoft Excel (Microsoft Corporation, USA). Each binary image was imported into ImageJ, and a logarithmic transformation was applied for visual assessment of image quality, for plotting profiles across the center of the images, and for calculating the full width at half maximum (FWHM) in the horizontal and vertical directions. RESULTS: The geometric, penetration, and scatter components at 140 keV for the low-energy general-purpose (LEGP) collimator were 93.20%, 4.13%, and 2.67%, respectively. Similarly, the geometric, penetration, and scatter components at 140 keV for the low-energy high-resolution (LEHR), medium-energy general-purpose (MEGP), and high-energy general-purpose (HEGP) collimators were (94.06%, 3.39%, 2.55%), (96.42%, 1.52%, 2.06%), and (96.70%, 1.45%, 1.85%), respectively. For the MEGP collimator at 245 keV and the HEGP collimator at 364 keV, the components were 89.10%, 7.08%, and 3.82% and 67.78%, 18.63%, and 13.59%, respectively. CONCLUSION: The LEGP and LEHR collimators are best for imaging 140 keV photons. The HEGP collimator can be used for 245 keV and 364 keV photons; however, correction for penetration and scatter must be applied if one wishes to quantify in vivo activity at 364 keV. Because of heavy penetration and scattering, 511 keV photons should not be imaged with the HEGP collimator.
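
The FWHM measurement mentioned above can be sketched as follows; the Gaussian profile is a stand-in for a profile taken across the center of one of the simulated images, and the interpolation scheme is a common choice rather than the paper's exact procedure.

```python
# FWHM of a 1-D profile, with linear interpolation at the half-maximum crossings.
import numpy as np

def fwhm(profile):
    prof = profile - profile.min()
    half = prof.max() / 2.0
    above = np.where(prof >= half)[0]
    left, right = above[0], above[-1]

    def cross(i_lo, i_hi):
        # linear interpolation of the half-maximum crossing between two samples
        return i_lo + (half - prof[i_lo]) / (prof[i_hi] - prof[i_lo])

    x_left = cross(left - 1, left) if left > 0 else float(left)
    x_right = cross(right, right + 1) if right < len(prof) - 1 else float(right)
    return x_right - x_left

x = np.arange(128)
profile = np.exp(-0.5 * ((x - 64) / 5.0) ** 2)   # sigma = 5 -> FWHM ~ 11.77 pixels
print("FWHM (pixels):", round(fwhm(profile), 2))
```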

11.
Indian J Nucl Med ; 29(4): 235-40, 2014 Oct.
Article in English | MEDLINE | ID: mdl-25400362

ABSTRACT

PURPOSE: Acquisition of higher counts improves the visual perception of positron emission tomography-computed tomography (PET-CT) images. Larger radiopharmaceutical doses (implying more radiation dose) are administered to acquire these counts in a short time. However, diagnostic information does not increase beyond a certain threshold of counts. This study was conducted to develop a post-processing method based on the principle of "stochastic resonance" to improve the visual perception of PET-CT images that contain the required threshold counts. MATERIALS AND METHODS: PET-CT images (JPEG file format) with low, medium, and high counts were included in this study. Each image was corrupted by the addition of Poisson noise, whose amplitude was adjusted by dividing each pixel by a constant of 1, 2, 4, 8, 16, or 32. The noise amplitude that gave the best image quality was selected on the basis of a high value of output-image entropy and high values of the structural similarity and feature similarity indices. Visual perception of the images was evaluated by two nuclear medicine physicians. RESULTS: The variation in structural and feature similarity was not appreciable visually, but statistically the images deteriorated as the noise amplitude increased, although structural (above 70%) and feature (above 80%) similarity with the input images was maintained in all cases. The best image quality was obtained at noise amplitude "4", at which 88% structural and 95% feature similarity with the input images was retained. CONCLUSION: This stochastic resonance method can be used to improve the visual perception of PET-CT images and can indirectly lead to a reduction in radiation dose.
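
A rough sketch of the noise-addition step is given below, interpreting "dividing each pixel by a constant" as Poisson resampling of the scaled image; this interpretation, the random stand-in image, and the reduced scoring (entropy plus structural similarity via scikit-image, without feature similarity) are assumptions for illustration only.

```python
# Poisson resampling at several divisors, scored by entropy and SSIM to the input.
import numpy as np
from skimage.metrics import structural_similarity

def shannon_entropy(img):
    hist, _ = np.histogram(img, bins=256, range=(0, 255))
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128)).astype(float)   # stand-in PET-CT slice

for k in (1, 2, 4, 8, 16, 32):
    noisy = np.clip(rng.poisson(img / k).astype(float) * k, 0, 255)
    print(f"divisor {k}: entropy {shannon_entropy(noisy):.2f}, "
          f"SSIM {structural_similarity(img, noisy, data_range=255):.2f}")
```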

12.
Indian J Nucl Med ; 28(2): 75-8, 2013 Apr.
Article in English | MEDLINE | ID: mdl-24163510

ABSTRACT

OBJECTIVE: It is important to adhere to the as-low-as-reasonably-achievable (ALARA) concept during radiopharmaceutical (RPH) dose administration in pediatric patients. Several methods have been suggested over the years for calculating individualized RPH doses; some require complex calculations, and large variability exists in the doses administered to children. The aim of the present study was to develop a software application that can calculate and store the RPH dose along with the patient record. MATERIALS AND METHODS: We reviewed the literature to select the dose formulas and used Microsoft Access (a software package) to develop this application. Microsoft Excel was used to verify the accurate execution of the dose formulas. The manual and computer times required for calculating the RPH dose with this program were compared. RESULTS: The developed application calculates RPH doses for pediatric patients based on the European Association of Nuclear Medicine dose card and the weight-based, body surface area-based, Clark, Solomon (Fried), Young, and Webster formulas. It is password protected to prevent accidental damage and stores complete patient records, which can be exported to an Excel sheet for further analysis. It reduces the burden of calculation and saves considerable time: 2 min of computer time compared with 102 min of manual calculation with a calculator for all seven formulas for 25 patients. CONCLUSION: The software detailed above appears to be an easy and useful method for calculating pediatric RPH doses in routine clinical practice. This application will help the user to routinely apply the ALARA principle during pediatric dose administration.
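
For reference, the commonly quoted textbook forms of several of the dose rules named above are sketched below; the application's exact implementations and the EANM dose-card lookup are not reproduced, and the numbers used are purely illustrative.

```python
# Commonly quoted pediatric dose rules (child dose as a fraction of adult dose).
def clark(adult_dose, weight_lb):
    return adult_dose * weight_lb / 150.0            # Clark: weight in pounds

def young(adult_dose, age_yr):
    return adult_dose * age_yr / (age_yr + 12.0)     # Young: age in years

def webster(adult_dose, age_yr):
    return adult_dose * (age_yr + 1.0) / (age_yr + 7.0)

def fried(adult_dose, age_months):
    return adult_dose * age_months / 150.0           # Fried: infants, age in months

def bsa_based(adult_dose, bsa_m2):
    return adult_dose * bsa_m2 / 1.73                # child BSA relative to 1.73 m^2

adult_mbq = 185.0                                    # illustrative adult activity (MBq)
print(round(clark(adult_mbq, 44), 1),
      round(young(adult_mbq, 6), 1),
      round(webster(adult_mbq, 6), 1))
```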

13.
J Gastroenterol Hepatol ; 22(11): 1909-15, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17914969

ABSTRACT

BACKGROUND: It is currently recommended that all patients with liver cirrhosis undergo upper gastrointestinal endoscopy (UGIE) to identify those who have large esophageal varices (LEVx), which carry a high risk of bleeding and may benefit from prophylactic measures. This approach leads to unnecessary UGIE in those without LEVx. We tried to identify clinical, laboratory, and imaging parameters that may predict the presence of LEVx and help select patients for UGIE. METHODS: This prospective study included newly diagnosed patients with cirrhosis and no history of gastrointestinal bleeding who were scheduled to undergo UGIE. Patients underwent detailed clinical examination, blood tests (hematology, liver function tests), and ultrasonography. The size of esophageal varices was assessed at UGIE; Paquet's grades 0-II were classified as small varices, and grades III-IV as LEVx. Association of LEVx with qualitative and quantitative parameters was studied using chi-squared and Mann-Whitney U-tests, respectively. Parameters found to be significant were entered into a forward-conditional multivariate logistic regression analysis to identify independent predictors. Receiver operating characteristic (ROC) curve analysis was used to assess the efficacy of prediction models. RESULTS: Of the 101 patients (median age 45 years; range 15-74 years; 87 male; Child-Pugh class: A 18, B 31, C 52), 46 had LEVx. On univariate analysis, five variables were significantly associated with the presence of LEVx: pallor (P = 0.026), palpable spleen (P = 0.009), platelet count (P < 0.002), total leukocyte count (P < 0.0004), and liver span on ultrasound (P = 0.031). On multivariate analysis, two of these parameters, namely low platelet count and presence of a palpable spleen, were found to be independent predictors of LEVx. A ROC curve using the predictor function derived from this analysis had an area under the curve of 0.760. CONCLUSION: Presence of a palpable spleen and a low platelet count are independent predictors of the presence of LEVx in patients with cirrhosis. Use of these parameters may help identify patients with a low probability of LEVx who may not need UGIE. This may help reduce costs and discomfort for these patients and the burden on endoscopy units.


Subject(s)
Esophageal and Gastric Varices/diagnosis , Liver Cirrhosis/complications , Palpation , Platelet Count , Spleen/pathology , Adolescent , Adult , Aged , Esophageal and Gastric Varices/blood , Esophageal and Gastric Varices/diagnostic imaging , Esophageal and Gastric Varices/etiology , Esophageal and Gastric Varices/pathology , Esophagoscopy , Female , Humans , India , Leukocyte Count , Liver/diagnostic imaging , Liver Cirrhosis/blood , Liver Cirrhosis/diagnostic imaging , Liver Cirrhosis/pathology , Male , Middle Aged , Patient Selection , Predictive Value of Tests , Prospective Studies , ROC Curve , Sensitivity and Specificity , Severity of Illness Index , Spleen/diagnostic imaging , Ultrasonography , Unnecessary Procedures
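
The prediction model described above can be sketched with scikit-learn as follows: a logistic regression on palpable spleen and platelet count, scored by the area under the ROC curve. The synthetic cohort and coefficients are illustrative assumptions; the study's data and its reported AUC of 0.760 are not reproduced.

```python
# Logistic regression on palpable spleen and platelet count, scored by ROC AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 101
palpable_spleen = rng.integers(0, 2, n)                    # 0 = no, 1 = yes
platelets = rng.normal(150.0, 60.0, n)                     # x10^3 per microlitre
logit = 1.2 * palpable_spleen - 0.02 * platelets + 1.5     # synthetic ground truth
levx = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))        # large varices present?

X = np.column_stack([palpable_spleen, platelets])
model = LogisticRegression().fit(X, levx)
auc = roc_auc_score(levx, model.predict_proba(X)[:, 1])
print("ROC AUC on the synthetic cohort:", round(auc, 3))
```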