Abstract
Background: Past and present national initiatives advocate for electronic exchange of health data and emphasize interoperability. The critical role of public health in disease surveillance has been recognized through recommendations for electronic laboratory reporting (ELR). Many public health agencies have seen a trend toward centralization of information technology services, which adds another layer of complexity to interoperability efforts.
Objectives: To understand the process of data exchange and its impact on the quality of data transmitted through electronic laboratory reporting to public health. The study was conducted in the context of the Minnesota Electronic Disease Surveillance System (MEDSS), the public health information system supporting infectious disease surveillance in Minnesota. The Data Quality (DQ) dimensions framework by Strong et al. was chosen as the guiding framework for evaluation.
Methods: The assessment of data exchange for electronic laboratory reporting and its impact on data quality used a mixed methods approach, with qualitative data obtained through expert discussions and quantitative data obtained from queries of the MEDSS system. Interviews were conducted in an open-ended format from November 2017 through February 2018. Based on these discussions, two high-level categories of the data exchange process that could impact data quality were identified: onboarding for electronic laboratory reporting and internal data exchange routing. These categories in turn comprised eight critical steps, whose impact on data quality was identified through expert input. This was followed by analysis of MEDSS data against criteria identified by the informatics team.
Results: All four DQ categories (Intrinsic DQ, Contextual DQ, Representational DQ, and Accessibility DQ) were affected by the data exchange process, with varying influence on the underlying DQ dimensions. Some errors, such as improper mapping in electronic health records (EHRs) and laboratory information systems, had a cascading effect and could pass through technical filters, going undetected until the data were used by epidemiologists. Some DQ dimensions, such as accuracy, relevancy, value-added, and interpretability, depend more on users at either end of the data exchange spectrum: the relevant clinical groups and the public health program professionals. The study revealed that data quality is dynamic and that ongoing oversight is a combined effort of the MEDSS operations team and review by technical and public health program professionals.
Conclusion: With increasing electronic reporting to public health, there is a need to understand the current processes for electronic exchange and their impact on data quality. This study focused on electronic laboratory reporting to public health and analyzed both onboarding and internal data exchange processes. Insights gathered from this research can be applied to other current public health reporting (e.g., immunizations) and will be valuable in planning for electronic case reporting in the near future.