1.
Can Assoc Radiol J ; 72(3): 483-489, 2021 Aug.
Article in English | MEDLINE | ID: mdl-32162532

ABSTRACT

The Canadian Association of Radiologists and Osteoporosis Canada currently endorse a fracture risk prediction tool called CAROC. It has been used in Canada since 2005, with an update in 2010, and is an integral part of bone mineral densitometry reporting across the country. New osteoporosis guidelines from Osteoporosis Canada (OC) are expected in the near future, and there has been pressure on radiologists to report fracture risk using an alternative prediction platform called FRAX. In addition, OC collaborated in the development of the Canadian FRAX model and has been co-promoting both FRAX and CAROC, raising the prospect that the new guidelines may seek to replace CAROC with FRAX for fracture risk determination. A number of concerns have been raised about FRAX: (1) FRAX has not released its algorithms to the public domain, so results cannot be verified for an individual patient; (2) FRAX incorrectly claimed that it was developed by the World Health Organization (WHO) and used this affiliation to promote itself until the WHO recently ordered it to desist; (3) FRAX requires clinical information beyond that needed for CAROC, and such patient-reported medical data are prone to substantial error; and (4) despite claims to the contrary, there are no valid studies comparing FRAX to CAROC. We believe radiologists should be aware of these issues so that they can provide input into future Technical Standards for Bone Mineral Densitometry Reporting of the Canadian Association of Radiologists.
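On point (1), the contrast with CAROC is instructive: the CAROC logic is public and simple enough to verify by hand. Below is a minimal Python sketch of the clinical-modifier step as commonly described in the 2010 Osteoporosis Canada guidance, offered as an illustration only; the base category lookup by age, sex, and femoral neck T-score comes from the published tables and is not reproduced here, and all names are illustrative.

    from enum import IntEnum

    class Risk(IntEnum):
        # CAROC-style 10-year absolute fracture risk categories
        LOW = 0       # < 10%
        MODERATE = 1  # 10-20%
        HIGH = 2      # > 20%

    def caroc_style_category(base: Risk,
                             fragility_fracture_after_40: bool,
                             prolonged_glucocorticoid_use: bool) -> Risk:
        # Either clinical modifier raises the base category one level;
        # both together indicate high risk. The base category is assumed
        # to come from the published age/sex/T-score tables.
        if fragility_fracture_after_40 and prolonged_glucocorticoid_use:
            return Risk.HIGH
        if fragility_fracture_after_40 or prolonged_glucocorticoid_use:
            return Risk(min(base + 1, Risk.HIGH))
        return base

Because the whole decision path fits in a few lines, a reported CAROC category can be independently checked, which is precisely what concern (1) says cannot be done with FRAX.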


Subject(s)
Fractures, Bone/etiology; Osteoporosis/complications; Osteoporosis/diagnostic imaging; Practice Guidelines as Topic; Risk Assessment/methods; Absorptiometry, Photon; Algorithms; Bone Density; Canada; Humans; Risk Assessment/standards; Risk Factors; Validation Studies as Topic; World Health Organization
2.
Int J Qual Health Care ; 28(3): 294-8, 2016 Jun.
Article in English | MEDLINE | ID: mdl-26892609

ABSTRACT

OBJECTIVE: To assess review completion rates, RADPEER score distribution, and sources of disagreement when using a workstation-integrated radiology peer review program, and to evaluate radiologist perceptions of the program. DESIGN: Retrospective review of prospectively collected data. SETTING: Large private outpatient radiology practice. PARTICIPANTS: Radiologists (n = 66) with a mean of 16.0 (standard deviation, 9.2) years of experience. INTERVENTIONS: Prior studies and reports of cases being actively reported were randomly selected for peer review using the RADPEER scoring system (a 4-point scale, with a score of 1 indicating agreement and scores of 2-4 indicating increasing levels of disagreement). MAIN OUTCOME MEASURES: Assigned peer review completion rates, review scores, sources of disagreement and radiologist survey responses. RESULTS: Of 31 293 assigned cases, 29 044 (92.8%; 95% CI 92.5-93.1%) were reviewed. Discrepant scores (score = 2, 3 or 4) were given in 0.69% (95% CI 0.60-0.79%) of cases and clinically significant discrepancy (score = 3 or 4) was assigned in 0.42% (95% CI 0.35-0.50%). The most common cause of disagreement was missed diagnosis (75.2%; 95% CI 66.8-82.1%). By anonymous survey, 94% of radiologists felt that peer review was worthwhile, 90% reported that the scores they received were appropriate and 78% felt that the received feedback was valuable. CONCLUSION: Workstation-based peer review can increase completion rates and levels of radiologist acceptance while producing RADPEER scores similar to those previously reported. This approach may be one way to increase radiologist engagement in peer review quality assurance.
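The reported intervals are easy to reproduce. Here is a minimal Python check using a normal-approximation (Wald) interval; the count of roughly 200 discrepant cases is back-calculated from the reported 0.69% of 29 044 reviewed cases and is therefore approximate.

    import math

    def prop_ci(k: int, n: int, z: float = 1.96):
        # Point estimate and normal-approximation 95% CI for a proportion.
        p = k / n
        se = math.sqrt(p * (1 - p) / n)
        return p, p - z * se, p + z * se

    p, lo, hi = prop_ci(29_044, 31_293)   # completion rate
    print(f"completion {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")   # 92.8% (92.5%-93.1%)

    p, lo, hi = prop_ci(200, 29_044)      # discrepant scores (2, 3 or 4)
    print(f"discrepant {p:.2%} (95% CI {lo:.2%}-{hi:.2%})")   # 0.69% (0.59%-0.78%)

The lower discrepancy bound comes out at 0.59% rather than the reported 0.60%, which suggests the authors used an exact binomial interval; the figures otherwise agree to within rounding.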


Subject(s)
Peer Review, Health Care/methods; Quality Assurance, Health Care/organization & administration; Radiology/organization & administration; Clinical Competence; Humans; Quality Assurance, Health Care/standards; Radiology/standards; Random Allocation; Retrospective Studies
4.
BMC Med Imaging ; 13: 19, 2013 Jul 04.
Article in English | MEDLINE | ID: mdl-23822583

ABSTRACT

BACKGROUND: Consistency of assessment between radiologists has become the accepted surrogate indicator of radiological excellence, and peer review has become the standard technique for evaluating that concordance. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice. METHODS: Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if neither the reviewing radiologist nor the original interpreting radiologist had exceeded review targets, the case was scored using the modified RADPEER system. RESULTS: There were 2,241 cases randomly assigned for peer review, of which 1,705 (76%) were reviewed. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%), and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found the scores appropriate, and 65% felt the feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable. CONCLUSIONS: The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
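The case-selection workflow described in the METHODS translates naturally into code. The sketch below is a schematic reconstruction in Python, not the Intelerad Peer Review implementation (whose internals are not public); the class and field names are hypothetical, and the no-self-review check is an assumption rather than a stated criterion.

    import random
    from dataclasses import dataclass, field

    @dataclass
    class Case:
        case_id: str
        interpreting_radiologist: str
        has_appropriate_prior: bool

    @dataclass
    class PeerReviewSelector:
        review_target: int                          # per-radiologist review cap
        tally: dict = field(default_factory=dict)   # reviews counted so far

        def under_target(self, radiologist: str) -> bool:
            return self.tally.get(radiologist, 0) < self.review_target

        def pick(self, active_cases: list, reviewer: str):
            # Randomly choose an eligible case from those being actively
            # reported, per the criteria described in the METHODS.
            eligible = [
                c for c in active_cases
                if c.has_appropriate_prior
                and c.interpreting_radiologist != reviewer      # assumed
                and self.under_target(reviewer)
                and self.under_target(c.interpreting_radiologist)
            ]
            return random.choice(eligible) if eligible else None

Keeping selection random while capping per-radiologist review counts avoids both cherry-picking and overloading any one reader, consistent with the "ideal peer review system" criteria the authors cite.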


Subject(s)
Diagnostic Imaging/standards; Peer Review/standards; Professional Competence/standards; Quality Assurance, Health Care/standards; Radiology/standards; User-Computer Interface; Canada; Pilot Projects; Systems Integration