A Comparison of Data Sources for Motor Vehicle Crash Characteristic Accuracy
Author(s) -
Grant Robert J.,
Gregor Mary Ann,
Beck Paul W.,
Maio Ronald F.
Publication year - 2000
Publication title -
Academic Emergency Medicine
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.221
H-Index - 124
eISSN - 1553-2712
pISSN - 1069-6563
DOI - 10.1111/j.1553-2712.2000.tb02067.x
Subject(s) - crash , inter rater reliability , medicine , kappa , emergency department , cohen's kappa , motor vehicle crash , observational study , medical record , reliability (semiconductor) , poison control , injury prevention , medical emergency , statistics , surgery , computer science , psychiatry , mathematics , rating scale , geometry , programming language , power (physics) , physics , quantum mechanics
Abstract -
Objective: To determine the accuracy of police reports (PRs), ambulance reports (ARs), and emergency department records (EDRs) in describing motor vehicle crash (MVC) characteristics when compared with an investigation performed by an experienced crash investigator trained in impact biomechanics.
Methods: This was a cross-sectional, observational study. Ninety-one patients transported by ambulance to a university emergency department (ED) directly from the scene of an MVC from August 1997 to April 1998 were enrolled. Potential patients were identified from the ED log, and consent was obtained to investigate the crash vehicle. Data describing MVC characteristics were abstracted from the PR, AR, and medical record. Variables of interest included restraint use (RU), air bag deployment (AD), and type of impact (TI). Agreement between each data source and the independent crash investigation was measured using kappa. Interrater reliability was determined using kappa by comparing a random sample of 20 abstracted reports for each data source with the originally abstracted data.
Results: Agreement (kappa) between the crash investigation and each data source was 0.588 (95% CI = 0.508 to 0.667) for the PR, 0.330 (95% CI = 0.252 to 0.407) for the AR, and 0.492 (95% CI = 0.413 to 0.572) for the EDR. Agreement by variable was 0.239 (95% CI = 0.164 to 0.314) for RU, 0.350 (95% CI = 0.268 to 0.432) for AD, and 0.631 (95% CI = 0.563 to 0.698) for TI. Interrater reliability was excellent (kappa > 0.8) for all data sources.
Conclusions: The strength of agreement between the independent crash investigation and the data sources, as measured by kappa, was fair to moderate, indicating inaccuracies. This has ramifications for researchers and necessitates consideration of the validity and accuracy of crash characteristics contained in these data sources.
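For readers unfamiliar with the statistic, the sketch below illustrates how Cohen's kappa quantifies agreement between two data sources beyond what chance alone would produce: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each source's marginal category frequencies. The restraint-use codes in the example are hypothetical and are not taken from the study data; the function is a minimal illustration, not the authors' analysis code.

from collections import Counter

def cohen_kappa(labels_a, labels_b):
    """Cohen's kappa for two equal-length sequences of categorical codes."""
    n = len(labels_a)
    # Observed agreement: fraction of cases coded identically by both sources.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: sum of products of each source's marginal proportions.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical restraint-use coding: police report vs. independent crash investigation.
police = ["belted", "belted", "unbelted", "unknown", "belted", "unbelted"]
investigation = ["belted", "unbelted", "unbelted", "belted", "belted", "unbelted"]
print(round(cohen_kappa(police, investigation), 3))  # 0.429 for this toy example

Values near 1 indicate near-perfect agreement, values near 0 indicate agreement no better than chance; the study's source-level kappas of 0.330 to 0.588 fall in the commonly cited "fair to moderate" range.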
