Abstract
Continuous quality improvement initiatives (CQII) in home visiting programs have traditionally occurred within a local implementing agency (LIA), parent organization, or funding provision. In Missouri, certain LIAs participate in the Missouri Maternal, Infant, and Early Childhood Home Visiting program (MIECHV). Their CQII activities and the coordination of CQI efforts across agencies are limited to newsletters and quarterly meetings to discuss barriers to service delivery. The designed CQI process includes neither evaluation of program fidelity nor assessments and supports to assist with identifying and prioritizing areas where improvement is needed. Therefore, much of an LIA's CQII experience is often lost to the benefit of external agencies facing similar challenges. We developed a virtual environment, the Missouri MIECHV Gateway, for CQII activities. The Gateway promotes and supports quality improvement for LIAs while aligning stakeholders from seven home visiting LIAs. Development of the Gateway environment aims to complement the existing MIECHV CQI framework by: 1) adding CQI elements that are missing or ineffective, 2) adding elements for CQI identification and program evaluation, and 3) offering LIAs a network to share CQI experiences and collaborate at a distance. This web-based environment allows LIA personnel to identify program activities in need of quality improvement and guides the planning, implementation, and evaluation of CQII. In addition, the Gateway standardizes quality improvement training, collates overlapping resources, and supports knowledge translation, thereby aiming to improve capacity for measurable change in organizational initiatives. This interactive web-based portal provides the infrastructure to virtually connect and engage LIAs in CQI and to stimulate the sharing of ideas and best practices. This article describes the characteristics, development, build, and launch of this quality improvement practice exchange virtual environment and presents results of three usability pilot tests and the site launch. Briefly, prior to deployment to 58 users, usability pilot testing of the site occurred in three stages with three defined groups. Pilot testing results were overall positive and vital to improving the site prior to the full launch. The majority of reviewers stated they would access and use the learning materials (87%) and use the site for completing CQII (80%), and reported that the site would benefit their work teams in addressing agency challenges (66%). The majority of reviewers also approved of the developed fidelity assessment, rating it easy to use (79%), clear in purpose (86%), valuable for self-identification of CQII (75%), and appropriate in its recommendations (79%). The System Usability Scale (SUS) score increased by 10% between pilot groups 2 and 3, with a mean SUS score of 71.6, above the U.S. average of 68. The site launched to 60 invited users; the majority (67%) adopted and used the site. Site stability was high, with 6 total minutes of downtime. The site averaged 29 page views per day.