(Last Updated On: January 31, 2014)

by Dianne M. Cearlock, PhD, Chief Executive Officer

A great assessment plan is the program director’s best marketing tool. Assessment may seem like just another requirement for meeting NAACLS Standards, and it is, but the savvy program director uses the assessment plan for much more. The intended use of assessment is continuous quality improvement of the program, but a great assessment plan also feeds directly into a successful marketing strategy for the program.

The 2012 NAACLS Standards require all programs to have a systematic assessment plan that measures the effectiveness of the program and uses the findings for continuous refinement of the program’s curriculum, education delivery methods, and other processes. Several quantitative outcome measures must be included in the assessment plan, including certification/licensure pass rates, attrition rates, graduation rates, and placement rates. The use of other measures, such as the results of capstone projects, faculty feedback, exit or final examinations, exit interviews with graduates, student and graduate professional leadership, impact of the program on local and regional healthcare, etc., is optional (Standard 2, I, a-c). Programs should align the assessment plan and the outcome measures used with the institutional and program missions. In other words, the assessment plan should be an ongoing process to document that a program is fulfilling its mission and that of the sponsoring institution and, if not, to make changes to the program consistent with achieving both missions.

NAACLS established benchmarks for several quantitative outcome measures, including certification pass rates of ≥75% on BOC examinations and graduation and placement rates of ≥70%. All of these are reported annually using a rolling 3-year average, and when a benchmark is not met, further analyses of the program may be required (Standards Compliance Guide). But it is vitally important not to let these quantitative measures control the “story” of a program. A program, its students, and its faculty are so much more than the sum of its numbers. And as important and compelling as certification/licensure pass rates are to those of us in the field, is recitation of those statistics, no matter how wonderful a program’s graduates and pass rates are, really “sexy” to Chairs, Deans, Presidents, and CEOs? It’s doubtful. And that is how a great assessment program, linked to the institution and program missions, and open to innovation by program officials, connects with marketing.

NAACLS programs vary greatly in terms of missions, models, education delivery styles, student populations, and communities. The NAACLS Board of Directors is well aware of that and strives to make decisions that assure the public of high-quality programs and competent graduates while allowing programs the maximum amount of freedom to grow and thrive in ways suitable to their sponsors, missions, and communities. Urban, rural, academic, clinical, military, and other environments have their own priorities and personalities. And the NAACLS requirement for assessment plans allows (indeed, invites!) program officials to tailor the plans to the particular program’s needs and unique qualities. If program officials take advantage of this opportunity, then the findings from assessment provide “at-your-fingertips” data and the tools needed to tell the program’s “story” to different audiences. To current or potential students, certification and licensure pass rates are critically important. To graduates and their loved ones, employment placement rates are everything. To the community, the number and competency of a program’s graduates are essential to staffing laboratories and providing healthcare services. To Chairs, Deans, and Presidents, enrollments, attrition, and the building of strategic partnerships with community businesses are compelling. And to industry CEOs, graduates who have achieved leadership roles in businesses and communities and support their alma maters have great influence on decision-making.

If an assessment plan garners information about all these various outcomes, findings from that assessment plan allow program officials to tell various audiences the “story” of their program, anytime, anywhere, with very little additional preparation. And storytelling should be done consistently, to various audiences, and for a multitude of reasons. There is no “off” season for storytelling. The ways that programs are meeting (and exceeding?) their institutional and program missions should be constantly shared with as many stakeholders as possible. Websites, brochures, press releases, presentations to school or community groups, participation in career and job fairs, memberships on the advisory committees of other programs or institutions – any and all of these are opportunities to tell people about a wonderful program and how it serves its sponsoring institution and community. It should never be assumed that people just know this – because they do not.

Every program has its unique character, and what works best for one program may not be the best approach for another. But there are a few outcomes that frequently prove very effective, and program officials are wise to assess these outcomes, build a database, make alterations as needed to improve performance or align more closely with the mission, and be ready to spread the word at the drop of a hat.

  • Student-Generated Hours – Anything that increases these numbers is generally good for marketing purposes. A number of strategies may be used, including active recruitment of students, bridge programs allowing lab professionals to further develop their careers, creating courses of interest to a large student community (not just those in one’s program), developing online courses, and more. There are limits to how many students faculty can serve, but it pays to think creatively about how limited faculty can work “smarter,” not “harder.” The greater the student-generated hours relative to the number of faculty, the less expensive a program appears on the balance sheet.
  • Employment Data – These data usually work in favor of the program. Unemployment of competent laboratory professionals who wish to work is not common. But go beyond merely keeping track of the number of graduates placed in employment or further education (as NAACLS requires). Obtain from area or regional clinical sites information about the percentages of laboratory staff that graduated from the program. Have any of those graduates risen to positions of leadership or somehow made significant contributions to their employers? The public wants to see accountability from educational and clinical institutions and documenting that program graduates are gainfully employed is very convincing.
  • Connections, Community Impact and Clout – This is a more difficult outcome to address because it is people-oriented and not easily captured in numbers. For years, laboratory professionals have tended to hide out down in the laboratory vault and calculate test results past the decimal point. But how does that help decision-makers know how well a program meets the institutional mission? Or that a program is critical to the community? In general, any activity that gets program officials (and members of their Advisory Committees!) interacting with people outside of the program may be fruitful. Program officials should sit on committees and advisory boards, have coffee with laboratory managers, stay in touch with program graduates, and reach out to regional employers for possible partnership opportunities. Chair the junior high school career fair? Serve as the advisor for a student club? Volunteer at health screenings at area employers? Host an alumni night? Present a primer on interpreting lab results at the local seniors’ community center? All of these are possibilities, and there are many more. All those little connections put program officials in touch with their communities and let them spread the word about the impact of the program on the region. And some of those connections may well have the ear of decision-makers. That’s clout. The old saying that “it’s not what you know but who you know” is more relevant than ever. The impact of a program, its faculty, and its graduates on the community is a difficult outcome to “measure,” but it is probably the most important outcome of all to share. The ability to articulate how a program meets its mission through service to people regionally and its impact on the community is ultimately more convincing to decision-makers than the best of certification/licensure pass rates.

Assessment is the planning and implementation of a system, in alignment with the sponsor’s and program’s mission, for tracking program and graduate outcomes. Findings are then analyzed and used for continuous program improvement. Assessment findings should indicate whether or not a program is achieving its mission and, if not, point to alterations that may be made to improve performance.

Because each program is unique, assessment systems should also be unique such that the findings may be used to articulate the program’s success at achieving its mission and to “tell its story”. And with a story to tell, it is easy to talk to people about the program and to find the nuances in the story that most interest the particular listeners. Good marketing is a form of storytelling and the best and most interesting stories are those about people. So gather all those numerical outcomes data for the program but don’t forget to gather the data about the people of the program and their roles in their communities too!
