Most medical teaching programs require months of firsthand training in a health care setting. Scheduling rotations, tracking earned credits, monitoring patient procedures, and benchmarking assessments are typically handled manually via paper-based systems. Such was the case at Southern California University of Health Sciences, where a paper system initially tracked 150 interns working in 25 different locations. Progress toward graduation was recorded and monitored manually through paper transcripts, credits, and standardized test scores, a process that required hours of data input.
That changed in 2005, the year program accreditors on a site visit requested data that the university was unable to provide in a timely manner. Finding documentation of student credits often meant searching through boxes of slips of paper.
Concerns about effectiveness and efficiency sparked a call for an improved system. Kevin Rose, coordinator of intern education, was asked to lead the project. SCU needed a better system for tracking intern educational requirements, as well as help dealing with the ever-increasing accreditation reporting demands.
The university had been using Microsoft Access for about three years to track student intern credits, so it was already loaded on campus computers. Expanding its use was a low-cost solution to serve as a proof of concept for a broader, more robust web application. After demonstrating the possible efficiency improvements with Access, Michael Sackett, dean of the Los Angeles College of Chiropractic, and other administrators asked the university’s IT department to create a database-driven web application: the Clinical Internship Portal.
The portal tracks the credits and patient procedures students have completed, schedules clinical rotations while checking for conflicts, and stores student assessments. In the past, a staff member spent 20 to 30 hours per month simply scheduling clinical rotations. The portal distributed that responsibility to clinical faculty, and the administrative time has dropped to about 20 minutes per month, while schedule conflicts and errors have fallen to near zero. Students can now log in to see which degree requirements they have already met and which they still need to complete, explains Sackett.
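The article does not describe how the portal's conflict checking is implemented, but the core idea can be illustrated with a minimal sketch: treat each rotation as a date range per intern and flag any intern assigned to two overlapping ranges. The record layout, intern IDs, and clinic names below are hypothetical, for illustration only.

```python
from datetime import date

def find_conflicts(rotations):
    """Return pairs of rotations that overlap for the same intern.

    Each rotation is a tuple: (intern_id, site, start_date, end_date).
    """
    conflicts = []
    # Sort by intern, then start date, so any overlap for one intern
    # appears between adjacent records.
    ordered = sorted(rotations, key=lambda r: (r[0], r[2]))
    for prev, curr in zip(ordered, ordered[1:]):
        same_intern = prev[0] == curr[0]
        overlaps = curr[2] <= prev[3]  # next rotation starts before the prior one ends
        if same_intern and overlaps:
            conflicts.append((prev, curr))
    return conflicts

# Hypothetical schedule: intern-01 is double-booked in March.
schedule = [
    ("intern-01", "Clinic A", date(2006, 1, 9), date(2006, 3, 31)),
    ("intern-01", "Clinic B", date(2006, 3, 15), date(2006, 6, 30)),
    ("intern-02", "Clinic A", date(2006, 1, 9), date(2006, 3, 31)),
]
print(len(find_conflicts(schedule)))  # prints 1: the overlapping pair for intern-01
```

In a database-driven application such a check would more likely be a query with an overlap condition on date columns, but the pairwise comparison above captures the same rule.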
Beyond efficiency increases, this new automation has spurred unexpected improvements. While accreditation requirements demand that universities document what students have done, “We’re now doing a lot more tracking of conditions treated and procedures,” explains Rose. “We’re also doing more student evaluations, and they are available for all involved faculty to read.”
Sackett adds that the portal “has improved the whole training process.”