A Customized Assessment and Tevera Help The Chicago School Achieve CACREP Accreditation
The Chicago School addresses CACREP 2016-aligned program and student learning outcomes (SLOs) with the DSKCA and Tevera
Back in 2014, one of The Chicago School of Professional Psychology’s (TCSPP) newest programs, the online campus for the M.A. in Clinical Mental Health Counseling (CMHC), had a total of two classes built. Enter Lori Soli, Ph.D., the newly hired Director of Clinical Training, who was presented with the Herculean task of developing all remaining components of the CMHC program and getting it up and running.
Over that first year, she enlisted the assistance of two other faculty members, and the three of them labored over each new class, new hire, and new student recruited. All of that hard work finally paid off in January 2018 when the program received its eight-year CACREP accreditation.
Getting that accreditation was no small feat, but one accreditation issue in particular — that of finding the right assessment system for CMHC Online — proved uniquely challenging.
Using an Ethical Imperative as a Guide
Knowing the stakes were high — program administrators had resolved to submit their CACREP self-study by the time the first cohort graduated, so alumni could be grandfathered into an accredited program — Dr. Soli set aside her misgivings that assessment wasn’t really her forte and jumped in.
What she did know from her experience as a site supervisor in counseling practice and from her role in clinical training at previous institutions was just how crucial assessment is to maintaining a rigorous standard of care. “[I’d come] from a CACREP program at a great university, [so] the role of Gatekeeper was firmly instilled in me as a counselor educator. I’ve taken that ethical responsibility really seriously over the years, and have, at times, run up against challenges in being able to show that someone really isn’t a good fit for the profession, because there wasn’t any data other than someone’s word to prove it or say why.”
With this in mind, Dr. Soli began researching. What she discovered in the counselor education literature was a consistent pattern of competencies a counselor needed to possess upon graduation. Initially she organized these competencies into “buckets of things.” Eventually, these characteristics crystallized into Dispositions, Skills, and Knowledge.
“…[I] have, at times, run up against challenges in being able to show that someone really isn’t a good fit for the profession, because there wasn’t any data other than someone’s word to prove it or say why.”
— Dr. Lori Soli
If It Doesn’t Exist, Build It Yourself
Now that Dr. Soli knew exactly what to assess, she just needed to find the right instrument to measure the characteristics she’d defined in her research. The catch? That assessment did not exist. Some instruments assessed clinical skills. Others measured dispositions. None ticked all the boxes.
Necessity being the mother of invention, Dr. Soli and her colleague, Dr. LoriAnn Stretch (Program Chair), took the information Dr. Soli had compiled and assembled it into an original assessment called the Dispositions, Skills, and Knowledge Competencies Assessment (DSKCA).
“We decided to cross-reference everything to our program’s core values,” Dr. Soli said, “which meant we needed to base this instrument on the extant codes.” Consequently, the DSKCA cross-references to CACREP 2016 standards, the MCSJ 2016 Cultural Competency standards, and the ACA Code of Ethics, in addition to TCSPP’s own program-specific learning outcomes.
The Delivery Dilemma
Now that Dr. Soli had her assessment, one more challenge remained: determining the best way to deliver the assessment to support student growth, programmatic outcomes, and accreditation. Initially, The Chicago School used SurveyMonkey, because the DSKCA is essentially a survey.
However, SurveyMonkey had several downsides:
- SurveyMonkey is not FERPA-compliant, opening the school to the risk of exposing sensitive student information obtained through the DSKCA.
- SurveyMonkey did not offer CMHC Online a way to easily compile and dynamically review DSKCA results, a capability the school needed to make the best use of the instrument.
- SurveyMonkey became yet another system for students, faculty, and site supervisors to figure out when and how to use.
With the downsides outweighing the benefits, Dr. Soli knew she needed to find another delivery method.
The Delivery Solution: Tevera
While Dr. Soli was researching and developing the DSKCA, she was also multitasking to bring The Chicago School’s CMHC Online program to life. Another decision she made during this process was to implement Tevera, practicum and internship software that makes it easier to manage placement sites, track student performance, and develop career skills in support of professional certifications and licensure.
As she learned more about Tevera — including its technology integration capabilities — housing the DSKCA within Tevera became the logical decision. “Our goal programmatically is to move everything we possibly can related to student assessment and clinical development in Tevera,” Dr. Soli explained.
Benefits of an Integrated System
“The DSKCA is not a ‘behind-the-curtain’ instrument,” Dr. Soli explained. “It’s the tool around which our whole program is operationalized. The benchmark levels are clear to students.”
Faculty and advisors can share students’ DSKCA results with them using Tevera’s Share feature, which lets them grant students permission to view the results. On the other hand, Tevera also gives faculty the capacity to withhold results if there is reason to do so.
“Developmentally, the school wants to see progress. Because each student is unique, their developmental trajectory will look different,” said Dr. Soli.
Tevera’s reporting tools help illustrate each student’s trajectory. They let program faculty and administration see cumulative scores by individual student, by cohort, or by year. Administrators can even review faculty members’ use of the DSKCA in order to coach them on properly assessing competencies.
“This is where the integration comes in. Tevera captures so easily who’s being assessed, who they’re being assessed by, and when. When you’re pulling reports, all of that data is there at your fingertips…that’s amazing!”
— Dr. Lori Soli
Could the DSKCA be Implemented for Other Programs?
Dr. Soli and her colleague are currently in the process of publishing the instrument, and their hope is that other universities will be able to use it in support of CACREP while easily adapting it to their own program standards. For the DSKCA’s knowledge competencies rubric, in particular, she notes, programs could either adopt it as-is, simply changing the name, or exchange it completely for their own.
Such adaptations would be facilitated for programs utilizing Tevera, because “Tevera can build the technological pieces to allow what can be modified to be modified” while maintaining the integrity of the instrument. “That’s the huge benefit of having instruments like this in an integrated technology platform like Tevera.”
There are important things to consider when transitioning program assessment into an integrated technology platform like Tevera, though. Intentionality at every level, including buy-in from administration, is key. “It can’t be, ‘If you want to…,’” Dr. Soli stresses. Because integrating new software into a program always includes a learning curve, she advises having a “champion” with the capacity to oversee the transition and guide staff through Tevera’s onboarding and implementation process.
“Being intentional about how and when you roll things out really sets you up for success,” she observes. CMHC Online did a “titrated” rollout of the DSKCA, carefully considering what made the most sense both in terms of scope and in terms of the point in a student’s journey at which to start delivering the DSKCA through Tevera. “We felt pre-field/field work classes were most critical, because those are the times that if there are dispositional or skills concerns, we need to catch them.”
She also noted the importance of planning at what point to bring on different stakeholders. “As the Director of Clinical Training here, the most important thing was that my site supervisors didn’t run screaming from the room. So, I needed to make sure that, as the person overseeing Tevera, I had the time and energy to give to them if they had challenges with the technology.” This progressive approach allowed CMHC Online to successfully get to a point of delivering student assessment completely through Tevera.
Ultimately, Dr. Soli advises that, whether with the DSKCA or other assessment tools, programs make sure to train program stakeholders on what the different categories of assessment mean, so that a consistent, rigorous standard of care can be maintained and developed in students. She’s proud of CMHC Online’s proven track record of holding students to high standards and sees the DSKCA as a way to strengthen student competencies within faculty-student relationships comparable to those at any “on-ground” campus.
This is something Dr. Soli is particularly passionate about, as she sees online education as an issue of advocacy and equity. “Online education opens up worlds for people who would never have access otherwise, and that’s the most exciting part for me.”
See Tevera for Yourself
Words matter. But seeing really is believing.
We appreciate the support of leaders with vision like Dr. Soli. If you’re interested in learning how we can support your vision, please contact us today!