1.
Oral Oncol; 145: 106480, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37454545

ABSTRACT

OBJECTIVE: Oral squamous cell carcinoma (OSCC) and oropharyngeal squamous cell carcinoma (OPSCC) can go undetected, resulting in late diagnosis and poor outcomes. We describe the development and validation of CancerDetect for Oral & Throat cancer™ (CDOT) to detect markers of OSCC and/or OPSCC within a high-risk population. MATERIAL AND METHODS: We collected saliva samples from 1,175 individuals who were 50 years or older, or who were adults with a history of tobacco use. Of these, 945 were used to train a classifier using machine learning methods, resulting in a salivary microbial and human metatranscriptomic signature. The classifier was then independently validated on the remaining 230 prospectively collected samples unseen by the classifier, consisting of 20 OSCC (all stages), 76 OPSCC (all stages), and 134 negatives (including 14 pre-malignant). RESULTS: On the validation cohort, the specificity of the CDOT test was 94%; sensitivity was 90% for participants with OSCC and 84.2% for participants with OPSCC. Similar classification results were observed for early-stage (stages I & II) and late-stage (stages III & IV) disease. CONCLUSIONS: CDOT is a non-invasive test that can be easily administered in dentist offices, primary care centres and specialised cancer clinics for early detection of OPSCC and OSCC. This test, having received FDA's breakthrough designation for accelerated review, has the potential to enable early diagnosis, saving lives and significantly reducing healthcare expenditure.
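The reported accuracy figures are simple proportions over the 230-sample validation cohort. Below is a minimal sketch of how sensitivity and specificity values of this kind are computed; the helper functions and the exact true/false split are illustrative assumptions, not the authors' classifier code.

```python
# Illustrative calculation of validation metrics like those reported above.
# The cohort counts come from the abstract; the positive/negative splits are
# assumed for illustration only.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Proportion of true disease cases flagged positive by the test."""
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives: int, false_positives: int) -> float:
    """Proportion of disease-free samples correctly ruled out."""
    return true_negatives / (true_negatives + false_positives)

# 20 OSCC samples, 90% sensitivity -> 18 detected, 2 missed (illustrative split)
print(round(sensitivity(18, 2), 3))     # 0.9
# 134 negatives, 94% specificity -> ~126 correctly negative, ~8 false positives
print(round(specificity(126, 8), 3))    # ~0.94
```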


Subject(s)
Carcinoma, Squamous Cell; Head and Neck Neoplasms; Mouth Neoplasms; Adult; Humans; Mouth Neoplasms/diagnosis; Mouth Neoplasms/genetics; Mouth Neoplasms/pathology; Carcinoma, Squamous Cell/pathology; Pharynx/pathology; Squamous Cell Carcinoma of Head and Neck; RNA; Saliva; Biomarkers, Tumor
2.
J Med Screen; 30(4): 175-183, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37264786

ABSTRACT

OBJECTIVES: To inform the development and evaluation of new blood-based colorectal cancer (CRC) screening tests satisfying minimum United States (US) coverage criteria, we estimated the impact of the different test performance characteristics on long-term testing benefits and burdens. METHODS: A novel CRC-Microsimulation of Adenoma Progression and Screening (CRC-MAPS) model was developed, validated, then used to assess different screening tests for CRC. We compared multiple, hypothetical blood-based CRC screening tests satisfying minimum coverage criteria of 74% CRC sensitivity and 90% specificity, to measure how changes in a test's CRC sensitivity, specificity, and adenoma sensitivity (sizes 1-5 mm, 6-9 mm, ≥10 mm) affect total number of colonoscopies (COL), CRC incidence reduction (IR), CRC mortality reduction (MR), and burden-to-benefit ratios (incremental COLs per percentage-point increase in IR or MR). RESULTS: A blood test meeting minimum US coverage criteria for performance characteristics resulted in 1576 lifetime COLs per 1000 individuals, 46.7% IR and 59.2% MR compared to no screening. Tests with increased CRC sensitivity of 99% (+25%) vs. increased ≥10 mm adenoma sensitivity of 13.6% (+3.6%) both yielded the same MR, 62.7%. Test benefits improved the most with increases in all-size adenoma sensitivity, then size-specific adenoma sensitivities, then specificity and CRC sensitivity, while increases in specificity or ≥10 mm adenoma sensitivity resulted in the most favorable burden-to-benefit tradeoffs (ratios <11.5). CONCLUSIONS: Burden-to-benefit ratios for blood-based CRC screening tests differ by performance characteristic, with the most favorable tradeoffs resulting from improvements in specificity and ≥10 mm adenoma sensitivity.
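The burden-to-benefit ratio described above is the incremental number of colonoscopies per percentage-point gain in incidence or mortality reduction, relative to the minimum-criteria reference test. A small sketch under assumed numbers (the improved-test values are placeholders, not CRC-MAPS outputs):

```python
# Sketch of the burden-to-benefit ratio: incremental colonoscopies (per 1000 people)
# per percentage-point of additional incidence or mortality reduction.
# The reference values come from the abstract; the "improved test" COL count is assumed.

def burden_to_benefit(cols_test: float, cols_ref: float,
                      reduction_test: float, reduction_ref: float) -> float:
    """Incremental COLs per percentage-point increase in IR or MR."""
    return (cols_test - cols_ref) / (reduction_test - reduction_ref)

# Reference test (74% CRC sensitivity / 90% specificity): 1576 COLs, 59.2% MR.
# Hypothetical improved test: 1610 COLs, 62.7% MR.
print(burden_to_benefit(1610, 1576, 62.7, 59.2))   # ≈ 9.7 incremental COLs per MR point
```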


Subject(s)
Adenoma; Colorectal Neoplasms; Humans; United States; Early Detection of Cancer/methods; Colorectal Neoplasms/epidemiology; Colonoscopy; Adenoma/diagnosis; Occult Blood
3.
Clin Gastroenterol Hepatol; 21(3): 704-712.e3, 2023 Mar.
Article in English | MEDLINE | ID: mdl-35337982

ABSTRACT

BACKGROUND & AIMS: Although liver transplantation (LT) has been demonstrated to provide survival benefit for patients with acute-on-chronic liver failure (ACLF), data are lacking regarding resource utilization for this population after LT. METHODS: We retrospectively reviewed data from 10 centers in North America of patients transplanted between 2018 and 2019. ACLF was identified by using the European Association for the Study of the Liver-Chronic Liver Failure criteria. RESULTS: We studied 318 patients, of whom 106 (33.3%) had no ACLF, 61 (19.1%) had ACLF-1, 74 (23.2%) had ACLF-2, and 77 (24.2%) had ACLF-3 at transplantation. Healthcare resource utilization after LT was greater among recipients with ACLF than among patients without ACLF with respect to median post-LT length of hospital stay (LOS) (P < .001), length of post-LT dialysis (P < .001), discharge to a rehabilitation center (P < .001), and 30-day readmission rates (P = .042). Multivariable negative binomial regression analysis demonstrated a significantly longer LOS for patients with ACLF-1 (1.9 days; 95% confidence interval [CI], 0.82-7.51), ACLF-2 (6.7 days; 95% CI, 2.5-24.3), and ACLF-3 (19.3 days; 95% CI, 1.2-39.7), compared with recipients without ACLF. Presence of ACLF-3 at LT was also associated with longer length of dialysis after LT (9.7 days; 95% CI, 4.6-48.8) relative to lower grades. Multivariable logistic regression analysis revealed greater likelihood of discharge to a rehabilitation center among recipients with ACLF-1 (odds ratio [OR], 1.79; 95% CI, 1.09-4.54), ACLF-2 (OR, 2.23; 95% CI, 1.12-5.01), and ACLF-3 (OR, 2.23; 95% CI, 1.40-5.73). Development of bacterial infection after LT also predicted LOS (20.9 days; 95% CI, 6.1-38.5) and 30-day readmissions (OR, 1.39; 95% CI, 1.17-2.25). CONCLUSIONS: Patients with ACLF at LT, particularly ACLF-3, have greater post-transplant healthcare resource utilization.
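For readers who want to see the shape of such an analysis, here is an illustrative negative binomial regression for post-LT length of stay using statsmodels; the data file, column names, and covariate set are hypothetical and do not reproduce the study's model.

```python
# Illustrative negative binomial regression for post-transplant length of stay,
# in the spirit of the multivariable analysis described above.
# The CSV and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("transplant_cohort.csv")   # hypothetical: one row per LT recipient

model = smf.glm(
    "post_lt_los ~ C(aclf_grade) + age + post_lt_bacterial_infection",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()

# Coefficients are on the log scale; exponentiate to interpret them as rate ratios.
print(model.summary())
```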


Subject(s)
Acute-On-Chronic Liver Failure; Liver Transplantation; Humans; Acute-On-Chronic Liver Failure/complications; Liver Cirrhosis/complications; Retrospective Studies; Patient Acceptance of Health Care; Prognosis
4.
PLoS One; 15(12): e0244431, 2020.
Article in English | MEDLINE | ID: mdl-33373409

ABSTRACT

BACKGROUND: Real-world adherence to colorectal cancer (CRC) screening strategies is imperfect. The CRC-AIM microsimulation model was used to estimate the impact of imperfect adherence on the relative benefits and burdens of guideline-endorsed, stool-based screening strategies. METHODS: Predicted outcomes of multi-target stool DNA (mt-sDNA), fecal immunochemical tests (FIT), and high-sensitivity guaiac-based fecal occult blood tests (HSgFOBT) were simulated for 40-year-olds free of diagnosed CRC. For robustness, imperfect adherence was incorporated in multiple ways and with extensive sensitivity analysis. Analysis 1 assumed adherence from 0%-100%, in 10% increments. Analysis 2 longitudinally applied real-world first-round differential adherence rates (base-case imperfect rates = 40% annual FIT vs 34% annual HSgFOBT vs 70% triennial mt-sDNA). Analysis 3 randomly assigned individuals to receive 1, 5, or 9 lifetime (9 = 100% adherence) mt-sDNA tests and 1, 5, or 9 to 26 (26 = 100% adherence) FIT tests. Outcomes are reported per 1000 individuals compared with no screening. RESULTS: Each screening strategy decreased CRC incidence and mortality versus no screening. In individuals screened between ages 50-75 and adherence ranging from 10%-100%, the life-years gained (LYG) for triennial mt-sDNA ranged from 133.1-300.0, for annual FIT from 96.3-318.1, and for annual HSgFOBT from 99.8-320.6. At base-case imperfect adherence rates, mt-sDNA resulted in 19.1% more LYG versus FIT, 25.4% more LYG versus HSgFOBT, and generally had preferable efficiency ratios while offering the most LYG. Completion of at least 21 FIT tests is needed to reach approximately the same LYG achieved with 9 mt-sDNA tests. CONCLUSIONS: Adherence assumptions affect the conclusions of CRC screening microsimulations that are used to inform CRC screening guidelines. LYG from FIT and HSgFOBT are more sensitive to changes in adherence assumptions than mt-sDNA because they require more tests to be completed for equivalent benefit. At imperfect adherence rates, mt-sDNA provides more LYG than FIT or HSgFOBT at an acceptable tradeoff in screening burden.
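One way to see why annual strategies are more adherence-sensitive is to count expected completed tests over the ages 50-75 screening window. The sketch below uses the abstract's base-case adherence rates and the simplifying assumption that adherence is independent across rounds:

```python
# Back-of-the-envelope view of differential adherence: expected completed tests over a
# 50-75 screening window. Adherence rates are the abstract's base-case values; treating
# each round as independent is a simplification, not the CRC-AIM model's logic.

def expected_completed_tests(start_age: int, stop_age: int,
                             interval_years: int, adherence: float) -> float:
    rounds = len(range(start_age, stop_age + 1, interval_years))
    return rounds * adherence

fit_annual    = expected_completed_tests(50, 75, 1, 0.40)   # 26 rounds * 0.40 ≈ 10.4
mtsdna_trien  = expected_completed_tests(50, 75, 3, 0.70)   #  9 rounds * 0.70 ≈ 6.3
print(fit_annual, mtsdna_trien)
```

Note that the round counts (26 FIT, 9 mt-sDNA) match the abstract's definition of 100% adherence; only the adherence multipliers are varied here.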


Subject(s)
Colorectal Neoplasms/diagnosis; Colorectal Neoplasms/epidemiology; DNA/analysis; Early Detection of Cancer/methods; Aged; Aged, 80 and over; Colorectal Neoplasms/genetics; Female; Guideline Adherence; Humans; Incidence; Longitudinal Studies; Male; Middle Aged; Models, Theoretical; Mortality; Occult Blood; Practice Guidelines as Topic
5.
ACR Open Rheumatol; 2(11): 629-639, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33044050

ABSTRACT

OBJECTIVE: Diagnosis of systemic lupus erythematosus (SLE) made by standard diagnostic laboratory tests (SDLTs) has a sensitivity and specificity of 83% and 76%, respectively. A multivariate assay panel (MAP) combining complement C4d activation products on erythrocytes and B cells with SDLTs yields a sensitivity and specificity of 80% and 86%, respectively, presumably enabling earlier SLE diagnosis at lower severity, with associated lower health care costs compared with SDLT diagnoses. We compared the payer budget impact of diagnosing SLE using MAP (incremental cost of $108) versus SDLTs. METHODS: We modeled a health plan of 1 million enrollees. The rate of SLE diagnosis among suspected patients was 9.2%. The MAP arm assumed 80%/20% of patients were tested with MAP/SDLTs, versus 100% tested with SDLTs in the SDLT arm. Prediagnosis direct costs were estimated from claims data, and postdiagnosis costs were obtained from the literature. Based on improved MAP performance, the assumed hazard ratio for diagnosis rate compared with SDLTs was 1.74 (71%, 87%, 90%, and 91% of patients who develop SLE are diagnosed in years 1 to 4 with MAP, compared with 53%, 75%, 84%, and 88% of patients diagnosed with SDLTs). RESULTS: Total 4-year pre- and postdiagnosis direct costs for patients with suspected SLE tested with MAP were $59,183,666, compared with $61,174,818 for those tested by SDLTs, with lower costs in the MAP arm due primarily to prediagnosis savings related to reduced hospital admissions. CONCLUSION: Incorporating MAP into SLE diagnosis results in an estimated 4-year direct cost savings of $1,991,152 ($0.04 per member per month). By facilitating earlier diagnosis of SLE, MAP may enhance patient outcomes.
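The per-member-per-month figure follows directly from the reported 4-year totals and the modeled plan size; a quick check using only numbers stated in the abstract:

```python
# Verifying the per-member-per-month (PMPM) savings from the abstract's reported totals.

total_savings = 61_174_818 - 59_183_666      # 4-year direct cost difference, SDLT minus MAP
members = 1_000_000                          # modeled plan enrollment
months = 4 * 12                              # 4-year horizon

pmpm = total_savings / (members * months)
print(f"${total_savings:,} over 4 years -> ${pmpm:.4f} PMPM")   # ≈ $0.04 PMPM
```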

6.
Cancer Prev Res (Phila); 13(5): 443-448, 2020 May.
Article in English | MEDLINE | ID: mdl-32029430

ABSTRACT

Colorectal cancer is a growing burden in adults younger than 50 years. In 2018, the American Cancer Society published a guideline update recommending a reduction in the colorectal cancer screening start age for average-risk individuals from 50 to 45. Implementing these recommendations would have important implications for public health. However, the approximate number of people impacted by this change, the average-risk population ages 45-49, is not well described in the literature. Here, we provide methodology to conservatively estimate the average-risk and screening-eligible population in the United States, including those who would be impacted by a lowered colorectal cancer screening start age. Using multiple data sources, we estimated the current average-risk population by subtracting individuals with symptomatic colorectal cancer, with a family history of colorectal cancer, and with inflammatory bowel disease and hereditary nonpolyposis colorectal cancer from the total population. Within this population, we estimated the number of screening-eligible individuals by subtracting those with previous colorectal cancer screening (45- to 49-year-olds) or up to date with colorectal cancer screening (50- to 74-year-olds). The total average-risk population is estimated at between 102.1 and 106.5 million people, of whom 43.4-45.2 million are eligible for colorectal cancer screening. Lowering the screening age would add roughly 19 million people to the average-risk population and, on immediate implementation, increase the current number of screening-eligible individuals by over 60% (from 27 to 44 million). Estimating the population size impacted by lowering the recommended colorectal cancer screening start age enables more accurate decision-making for policymakers and epidemiologists focused on cancer prevention.
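The estimation approach is essentially sequential subtraction from the total population. A schematic sketch with placeholder inputs (the paper derives each term from census, registry, and survey data sources):

```python
# Schematic version of the subtraction approach described above.
# Every value below is a placeholder, not an estimate from the paper.

total_population_45_74   = 120_000_000   # hypothetical total in the screening age range
symptomatic_crc          = 300_000       # hypothetical: diagnosed/symptomatic CRC
family_history_crc       = 12_000_000    # hypothetical: family history of CRC
ibd_or_hnpcc             = 3_000_000     # hypothetical: IBD or hereditary nonpolyposis CRC

average_risk = (total_population_45_74 - symptomatic_crc
                - family_history_crc - ibd_or_hnpcc)

# Second subtraction: remove those already screened (45-49) or up to date (50-74).
already_screened_or_up_to_date = 60_000_000   # hypothetical
screening_eligible = average_risk - already_screened_or_up_to_date

print(f"average-risk: {average_risk:,}  screening-eligible: {screening_eligible:,}")
```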


Subject(s)
Colorectal Neoplasms/diagnosis; Early Detection of Cancer/standards; Practice Guidelines as Topic/standards; Risk Assessment/standards; Age Factors; Aged; Colorectal Neoplasms/epidemiology; Early Detection of Cancer/methods; Early Detection of Cancer/statistics & numerical data; Female; Humans; Male; Middle Aged; Preventive Health Services; Risk Factors; SEER Program; United States/epidemiology
7.
Hepatology; 68(1): 78-88, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29023828

ABSTRACT

Surveillance for hepatocellular carcinoma (HCC) has been recommended in patients with cirrhosis. In this study, we examined the extent to which the competing risk of hepatic decompensation influences the benefit of HCC surveillance by investigating the impact of the availability of liver transplantation (LTx) and the rate of progression of hepatic decompensation on the survival gain from HCC surveillance. A multistate Markov model was constructed simulating a cohort of 50-year-old patients with compensated cirrhosis. The primary outcomes of interest were all-cause and HCC-specific mortality. The main input data included the incidence of HCC, the sensitivity of the screening test, and mortality from hepatic decompensation. Treatment modalities modeled included LTx, resection, and radiofrequency ablation. In the base-case scenario, LTx was available to prevent death in a certain proportion of patients. In the absence of surveillance, 68.2% of the cohort members died within 15 years; of these decedents, 25.1% died from HCC and 43.6% died from hepatic decompensation. With surveillance, the median survival improved from 10.4 years to 11.2 years. The number of subjects under surveillance needed to prevent one all-cause death and one HCC-specific death over 15 years was 28 and 18, respectively. In sensitivity analyses, the incidence of HCC and the progression of cirrhosis had the strongest effect on the benefit of surveillance, whereas LTx availability had a negligible effect. CONCLUSION: HCC surveillance decreases all-cause and tumor-specific mortality in patients with compensated cirrhosis regardless of LTx availability. In addition, the incidence of HCC and the sensitivity of the surveillance test also had a substantial impact on the benefits of surveillance. (Hepatology 2018;68:78-88).
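A multistate Markov cohort model of this kind propagates state-occupancy probabilities year by year through a transition matrix. The sketch below uses hypothetical states and annual transition probabilities, not the study's calibrated inputs; the point is the mechanics of advancing the cohort.

```python
# Minimal multistate Markov cohort sketch (illustrative transition probabilities only).
import numpy as np

states = ["compensated", "decompensated", "hcc", "dead"]
# Row = current state, column = next state; each row sums to 1. Values are placeholders.
P = np.array([
    [0.92, 0.04, 0.03, 0.01],   # compensated cirrhosis
    [0.00, 0.80, 0.04, 0.16],   # decompensated cirrhosis
    [0.00, 0.00, 0.70, 0.30],   # HCC
    [0.00, 0.00, 0.00, 1.00],   # death (absorbing state)
])

cohort = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts in compensated cirrhosis at age 50
for year in range(15):
    cohort = cohort @ P                   # advance the cohort one annual cycle

print(dict(zip(states, cohort.round(3)))) # state occupancy after 15 years
```

Surveillance strategies enter such a model by changing the probability that HCC is detected early enough for curative treatment (LTx, resection, ablation), which alters the HCC-to-death transition.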


Subject(s)
Carcinoma, Hepatocellular/mortality; Liver Cirrhosis/mortality; Liver Neoplasms/mortality; Models, Theoretical; Population Surveillance; Carcinoma, Hepatocellular/diagnosis; Carcinoma, Hepatocellular/surgery; Humans; Liver Cirrhosis/complications; Liver Neoplasms/diagnosis; Liver Neoplasms/surgery; Liver Transplantation; Markov Chains; Middle Aged
8.
Clin Gastroenterol Hepatol; 14(12): 1778-1787.e8, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27464589

ABSTRACT

BACKGROUND & AIMS: Patients with chronic ulcerative colitis are at increased risk for colorectal neoplasia (CRN). Surveillance by white-light endoscopy (WLE) or chromoendoscopy may reduce the risk of CRN, but these strategies are underused. Analysis of DNA from stool samples (sDNA) can detect CRN with high sensitivity, but it is not clear whether this approach is cost-effective. We simulated these strategies for CRN detection to determine which approach is most cost-effective. METHODS: We adapted a previously published Markov model to simulate the clinical course of chronic ulcerative colitis, the incidence of cancer or dysplasia, and the costs and benefits of care with 4 surveillance strategies: (1) analysis of sDNA and diagnostic chromoendoscopy for patients with positive results, (2) analysis of sDNA with diagnostic WLE for patients with positive results, (3) chromoendoscopy with targeted collection of biopsies, or (4) WLE with random collection of biopsies. Costs were based on 2014 Medicare reimbursement. The primary outcome was the incremental cost-effectiveness ratio (incremental cost/incremental difference in quality-adjusted life-years) compared with no surveillance, using a willingness-to-pay threshold of $50,000. RESULTS: All strategies fell below the willingness-to-pay threshold at 2-year intervals. Incremental cost-effectiveness ratios were $16,362 per quality-adjusted life-year for sDNA analysis with diagnostic chromoendoscopy; $18,643 per quality-adjusted life-year for sDNA analysis with diagnostic WLE; $23,830 per quality-adjusted life-year for chromoendoscopy alone; and $27,907 per quality-adjusted life-year for WLE alone. In sensitivity analyses, sDNA analysis with diagnostic chromoendoscopy was more cost-effective than chromoendoscopy alone up to a cost of $1135 per sDNA test. sDNA analysis remained cost-effective at all rates of compliance; when combined with diagnostic chromoendoscopy, it was preferred over chromoendoscopy alone when the specificity of the sDNA test for CRN was >65%. CONCLUSIONS: Based on a Markov model, surveillance for CRN is cost-effective for patients with chronic ulcerative colitis. Analysis of sDNA with diagnostic chromoendoscopy for patients with positive results was more cost-effective than chromoendoscopy or WLE alone.
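The incremental cost-effectiveness ratios above are incremental cost divided by incremental QALYs versus no surveillance, judged against the $50,000/QALY willingness-to-pay threshold. A minimal sketch with placeholder costs and QALYs (not the model's outputs):

```python
# Sketch of the ICER calculation underlying the comparisons above.
# Cost and QALY inputs below are placeholders for illustration only.

WTP_THRESHOLD = 50_000   # dollars per quality-adjusted life-year

def icer(cost_strategy: float, cost_comparator: float,
         qaly_strategy: float, qaly_comparator: float) -> float:
    """Incremental cost per incremental QALY versus the comparator."""
    return (cost_strategy - cost_comparator) / (qaly_strategy - qaly_comparator)

# Hypothetical surveillance strategy versus no surveillance.
value = icer(cost_strategy=21_000, cost_comparator=15_000,
             qaly_strategy=14.60, qaly_comparator=14.23)
print(f"ICER ≈ ${value:,.0f}/QALY; cost-effective at threshold: {value < WTP_THRESHOLD}")
```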


Subject(s)
Colitis, Ulcerative/complications; Colorectal Neoplasms/diagnosis; Cost-Benefit Analysis; DNA/analysis; Early Detection of Cancer/economics; Early Detection of Cancer/methods; Feces/chemistry; Humans