Results 1 - 3 of 3
1.
AANA J ; 87(1): 29-36, 2019 Feb.
Article in English | MEDLINE | ID: mdl-31587741

ABSTRACT

Chronic pain is a growing epidemic in America. Challenges in patients' access to care, and in reimbursement of Certified Registered Nurse Anesthetists (CRNAs) who provide pain services, have resulted in a voluntary subspecialty certification in nonsurgical pain management (NSPM) for CRNAs. This study evaluated CRNAs' perceptions of the value of certification in NSPM. An invitation to complete the Perceived Value of Certification Tool (PVCT) was sent to 474 CRNAs who identified the subspecialty practice of NSPM on their application for recertification to the NBCRNA. Data were collected on 18 factors related to the perceived value of certification in the NSPM subspecialty. Exploratory factor analysis using principal components analysis with varimax rotation was conducted to assess the latent structure of the PVCT and to identify potential constructs underlying CRNAs' perceptions. Reliability was assessed using Cronbach α coefficients. From the 64 CRNAs who provided data, a 3-factor solution emerged that explained 72.25% of the overall variance: personal satisfaction, professional recognition, and competence, each with good to excellent reliability (F1: α = 0.95, F2: α = 0.94, F3: α = 0.88). Identification of these 3 constructs will assist future efforts to validate the examination for the NSPM subspecialty certification for CRNAs.
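As a rough illustration of the analysis described in this abstract, the sketch below runs a principal-components extraction with varimax rotation and computes Cronbach α for the items loading on each rotated factor. It is a minimal, self-contained example on simulated Likert-type responses (64 respondents x 18 items, matching the sample size and item count reported here); the simulated data, loading cut-off, and printed labels are assumptions for illustration only, not the study's dataset or code.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation of a factor-loading matrix (Kaiser, 1958)."""
    p, k = loadings.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # Gradient of the varimax criterion (gamma = 1)
        grad = loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt
        new_var = s.sum()
        if new_var < var * (1 + tol):
            break
        var = new_var
    return loadings @ R

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated PVCT-style responses: 64 respondents x 18 Likert items (1-5)
rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(64, 18)).astype(float)

# Principal-components extraction from the item correlation matrix
corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = 3
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated = varimax(loadings)

print(f"Variance explained by 3 components: {eigvals[:n_factors].sum() / eigvals.sum():.1%}")

# Reliability of the items loading most strongly on each rotated factor
for f in range(n_factors):
    item_idx = np.where(np.abs(rotated[:, f]) >= 0.40)[0]
    if len(item_idx) > 1:
        print(f"Factor {f + 1} ({len(item_idx)} items): alpha = {cronbach_alpha(X[:, item_idx]):.2f}")
```

With real PVCT responses (rather than the random data above), the three rotated factors would be interpreted from their item loadings, which is how labels such as personal satisfaction, professional recognition, and competence are assigned.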


Subject(s)
Attitude of Health Personnel , Certification , Chronic Pain/therapy , Nurse Anesthetists , Pain Management , Chronic Pain/nursing , Humans , Reproducibility of Results , Surveys and Questionnaires , United States
2.
Eval Health Prof ; 42(1): 82-102, 2019 03.
Article in English | MEDLINE | ID: mdl-28727944

ABSTRACT

Simulation has been widely applied for education and training in the health-care professions. However, its value as a tool for assessing competence is not fully known. Logistical barriers to simulation-based assessment have led some health-care organizations to utilize computer-based case simulations (CCSs) for assessment. This article reviews the literature to identify psychometrically sound CCS instruments designed to measure decision-making competence in health-care professionals. Searches of the CINAHL, MEDLINE, and Ovid databases identified 84 potentially relevant articles published between January 2000 and May 2017, and 12 articles met the criteria for inclusion in this review. Their findings indicate that CCSs presenting clinical scenarios are used for summative assessment of higher-order aspects of competence, namely decision making. Psychometric strength was validated in eight articles and supported by four replication studies. Two of the eight articles reported evidence of construct validity and support the need for validity evidence grounded in a theoretical framework. This review has implications for further research on CCS tools as a method for assessing competence in health-care professionals and underscores the need for psychometric evidence to support their use.


Subject(s)
Clinical Competence , Computer Simulation , Clinical Competence/standards , Clinical Competence/statistics & numerical data , Clinical Decision-Making , Health Personnel/standards , Humans , Psychometrics , Reproducibility of Results
3.
J Appl Meas ; 17(4): 489-501, 2016.
Article in English | MEDLINE | ID: mdl-28009594

ABSTRACT

Alternative items were added as scored items to the National Certification Examination for Nurse Anesthetists (NCE) in 2010. A common concern about the new items has been their measurement attributes, and this study was undertaken to evaluate the psychometric impact of adding them to the examination. Candidates had significantly higher ability estimates on alternative items than on multiple-choice questions (MCQs), and 6.7% of candidates performed significantly differently across the two item formats. Ability estimates from alternative items and MCQs correlated at r = .58. The alternative items took significantly longer to answer than standard MCQs and discriminated to a higher degree. The alternative items exhibited unidimensionality to the same degree as MCQs, and the Bayesian information criterion (BIC) supported the Rasch model as acceptable for scoring. The new item types were found to have acceptable measurement attributes for inclusion in the certification program.
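For readers unfamiliar with the Rasch analysis summarized above, the sketch below shows one way such a format comparison could be set up: estimate each candidate's Rasch ability separately from the MCQ and alternative-item subsets (with item difficulties held fixed), then correlate and compare the two sets of estimates. The response matrix, item counts, and difficulty values are simulated assumptions for illustration only, not the NCE data or the authors' scoring procedure.

```python
import numpy as np
from scipy import stats

def rasch_ability(responses, difficulties, n_iter=25):
    """Newton-Raphson MLE of a Rasch person ability, item difficulties fixed."""
    # Clamp perfect and zero raw scores so the MLE stays finite
    score = np.clip(responses.sum(), 0.5, len(responses) - 0.5)
    theta = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(theta - difficulties)))
        grad = score - p.sum()        # d logL / d theta
        info = (p * (1.0 - p)).sum()  # Fisher information
        theta += grad / info
    return theta

# Simulated 0/1 responses: 1,000 candidates, 100 MCQs and 20 alternative items
rng = np.random.default_rng(1)
true_theta = rng.normal(size=1000)
b_mcq = rng.normal(size=100)            # MCQ difficulties (logits)
b_alt = rng.normal(loc=0.5, size=20)    # alternative items assumed slightly harder
p_mcq = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - b_mcq)))
p_alt = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - b_alt)))
mcq = (rng.random(p_mcq.shape) < p_mcq).astype(int)
alt = (rng.random(p_alt.shape) < p_alt).astype(int)

# Separate ability estimates from each item subset
theta_mcq = np.array([rasch_ability(r, b_mcq) for r in mcq])
theta_alt = np.array([rasch_ability(r, b_alt) for r in alt])

r, _ = stats.pearsonr(theta_mcq, theta_alt)
t, p_val = stats.ttest_rel(theta_alt, theta_mcq)
print(f"Correlation between subscale ability estimates: r = {r:.2f}")
print(f"Paired comparison (alternative vs MCQ): t = {t:.2f}, p = {p_val:.3g}")
```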


Subject(s)
Certification/methods , Educational Measurement/methods , Nurse Anesthetists/psychology , Nurse Anesthetists/statistics & numerical data , Psychometrics/methods , Surveys and Questionnaires , Certification/statistics & numerical data , Clinical Competence/statistics & numerical data , Data Interpretation, Statistical , Humans , Models, Statistical , Reproducibility of Results , Sensitivity and Specificity , United States