
MSACL 2020 US Abstract

Topic: Data Science

Podium Presentation in Room 1 on Thursday at 16:10 (Chair: Shannon Haymond / Judy Stone)

No Middleware? No Problem. Using R and Shiny for Routine Review of QC Data and Other Quality Metrics

Dennis Orton (Presenter)
Alberta Precision Laboratories

Presenter Bio(s): Dennis works as a Clinical Biochemist overseeing the Mass Spectrometry testing laboratory in Calgary for Alberta Precision Labs, and holds a cross-appointment as a Clinical Assistant Professor at the University of Calgary. His research interests include lab assay design targeting drug metabolism and pharmacokinetics.

Authors: Dennis Orton (1, 2)
(1) Alberta Precision Labs, Calgary, AB, Canada
(2) University of Calgary, Calgary, AB, Canada


Introduction: Review of quality control (QC) data in the clinical lab generally relies on costly, vendor-specific middleware systems that may not be user friendly or display all desired information. Additionally, many instruments do not come with a middleware option, the middleware may be cost-prohibitive, or it may not permit off-site data review, making routine QC review labour-intensive and time-consuming. These issues are especially problematic in regions with decentralized clinical testing networks or multiple analyzer vendors, where review of QC data is often limited to rudimentary Laboratory Information System (LIS) functionality that is neither user friendly nor intuitive.

Objectives: Design a user interface in R that streamlines QC data review and enables rapid multi-site and/or multi-analyzer QC comparisons.

Methods: This script employs R (version 3.6.1) and RStudio (version 1.2.1335) with the shinydashboard, ggplot2, and tidyverse packages to visualize QC data with filters for date range, assay type, QC product, analyzer, and QC lot number. The data is obtained from an automated download containing all QC data recorded in the LIS over a 24-hour period, including a sample identifier, test name, verified date, analyzer result, expected mean and standard deviation, QC product, and QC lot number. The data is saved to a shared network drive with access restricted to regional supervisory and technical staff.
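The nightly import step described above could be sketched as follows. This is a minimal illustration, not the production script: the network path and column names (verified_date, etc.) are assumptions, since the abstract does not specify the export layout.

```r
# Sketch of the automated import: read all daily QC export files from the
# shared network drive and keep the most recent 30 days of records.
# The path and column names below are hypothetical.
library(tidyverse)
library(lubridate)

qc_files <- list.files("//labshare/qc_exports",
                       pattern = "\\.csv$", full.names = TRUE)

qc_data <- qc_files %>%
  map_dfr(read_csv) %>%                        # stack all daily exports
  mutate(verified_date = as_date(verified_date)) %>%
  filter(verified_date >= today() - days(30))  # default 30-day window
```

Reading every daily file and filtering by date, rather than maintaining a single growing file, keeps each export small and lets users widen the window simply by importing more files.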

The R script automatically imports the previous 30 days of QC data and displays the running mean and standard deviation for each test using the applied filters. Using the shinydashboard format, more or less data can be viewed by importing additional data files or by applying date range filters. Optional filters include test name, analyzer name/type, and QC material, allowing users to assess assay performance down to the instrument level. Multiple tabs display the data in tabular or graphical format, with options to summarize by day, week, or month.
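A skeleton of the dashboard layout described above might look like the following. This is a hedged sketch under assumed names (qc_data and its columns are hypothetical), showing the filter sidebar and the tabular/graphical tabs rather than the full application.

```r
# Minimal shinydashboard skeleton: sidebar filters feed a reactive subset
# that drives a QC chart tab and a summary table tab.
library(shiny)
library(shinydashboard)
library(ggplot2)
library(dplyr)

ui <- dashboardPage(
  dashboardHeader(title = "QC Review"),
  dashboardSidebar(
    dateRangeInput("dates", "Date range", start = Sys.Date() - 30),
    selectInput("test", "Test name", choices = NULL),
    selectInput("analyzer", "Analyzer", choices = NULL),
    selectInput("lot", "QC lot number", choices = NULL)
  ),
  dashboardBody(
    tabsetPanel(
      tabPanel("Chart", plotOutput("qc_plot")),
      tabPanel("Table", tableOutput("qc_summary"))
    )
  )
)

server <- function(input, output, session) {
  filtered <- reactive({
    qc_data %>%
      filter(verified_date >= input$dates[1],
             verified_date <= input$dates[2],
             test_name == input$test,
             analyzer == input$analyzer,
             qc_lot == input$lot)
  })
  output$qc_plot <- renderPlot({
    # Points over time with the running mean as a dashed reference line
    ggplot(filtered(), aes(verified_date, result)) +
      geom_point() +
      geom_hline(aes(yintercept = mean(result)), linetype = "dashed")
  })
  output$qc_summary <- renderTable({
    filtered() %>%
      group_by(qc_lot) %>%
      summarise(n = n(), mean = mean(result), sd = sd(result))
  })
}

# shinyApp(ui, server)
```

In practice the selectInput choices would be populated from the imported data (e.g. with updateSelectInput), so the filter lists stay in sync with whatever tests and analyzers appear in the LIS export.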

Results: This dashboard streamlines review of QC data from various analyzer types and vendors, across sites and lot numbers, all of which can be viewed remotely. This allows technical staff to complete routine and monthly QC review quickly and to identify analyzers showing shifts in QC running means relative to other analyzers or lot numbers, promoting better region-wide lab quality and inter-site continuity. Adaptation of this dashboard could also allow review of other quality metrics, such as patient running means or hemolysis rates, provided the data is captured in the LIS and access to the raw data is available.

Conclusion: This is a simple, customizable tool that is able to compile QC data for review without the need for investment in expensive or complicated middleware products.

Financial Disclosure

Board Member: no
IP Royalty: no

Planning to mention or discuss specific products or technology of the company(ies) listed above: