An Outcomes Data Analysis Study to Identify the Unique Characteristics of Several Educational Formats
Dustin Long, MS; Patrick Hayes, BA; Brian McGowan, PhD; Timothy Hayes, MD, PhD
AcademicCME, Wayne, PA
Goal
This storyboard presents data for a comparative outcomes analysis of several CME program
formats standardized by disease state and objectives.
The analysis of these data will arm the CME professional with a more rigorous, quantitative
basis for selecting the right format for the right audience for the right learning objectives.
These findings allow for a rare opportunity to apply evidence-based rationale to format
decision-making.
Demographics

Live Satellite Symposia at Neurology Meetings

                               Number    Percent
Total Audience                 276*
Total Number of Certificates   224       81% (of audience)
Neuro                          214       96% (of certificates)
PCP                              8        3%
Other                            2        1%

Evaluation Data (learner-reported, 1-5 scale): Were Learning Objectives Met?; Content Usefulness; % Commercial Unbias
Background
A principal component of developing and implementing an effective CME program is the initial choice and design of the educational format (e.g., live events, print pieces, online programs).
There is limited general guidance within the CME literature. For example, blended formats
and serialized learning endeavors have been associated with increased learner engagement,
compared to episodic single-format educational programs.
However, little to no data are available comparing the effectiveness or roles of various CME
delivery formats. The CME professional is forced to make assumptions regarding the
effectiveness of educational formats for the targeted audience.
As a result, the CME community is left with little empirical evidence as to the quantitative,
learner-based consequences of any particular program design.
Print Monographs

                               Number    Percent
Total Audience                 12000**
Total Number of Certificates   348        3% (of audience)
Neuro                          162       47% (of certificates)
PCP                             38       11%
Other                          148       43%

Panel Webcast: 1-Hour On-Demand Webcast Promoted with Online CME Distributors

                               Number    Percent
Total Audience                 3151***
Total Number of Certificates   524       17% (of audience)
Neuro                           45        9% (of certificates)
PCP                            170       32%
Other                          309       59%

Panel Webcast: 1-Hour On-Demand Webcast Promoted with Peer-Reviewed Journals

                               Number    Percent
Total Audience                 2563***
Total Number of Certificates   492       19% (of audience)
Neuro                          315       64% (of certificates)
PCP                             59       12%
Other                          118       24%

Webseries: Four-Part, 30-Minute On-Demand Webseries Utilizing the Learning Actions Model

                               Number    Percent
Total Audience                 1287
Total Number of Certificates   353       27% (of audience)
Neuro                          155       44% (of certificates)
PCP                             74       21%
Other                          124       35%

Cumulative (all five formats): 1941 certificates (Neuro 891, PCP 349, Other 701)

Evaluation Data (learner-reported, 1-5 scale)

Format                               LOs Met?   Content Usefulness   % Commercial Unbias
Live Satellite Symposia              4.8        4.6                  100%
Print Monographs                     4.4        4.4                   99%
Webcast (Online CME Distributors)    4.5        4.5                  100%
Webcast (Peer-Reviewed Journals)     4.6        4.6                   98%
Webseries                            4.5        4.5                  100%
Pre/Post-Test Results: Knowledge (average % correct)

Format                               LO 1 Pre/Post/Dif   LO 2A Pre/Post/Dif   Average Pre/Post/Dif
Live Satellite Symposia              27% / 78% / 51%     40% / 95% / 55%      34% / 87% / 53%
Print Monographs                     48% / 88% / 40%     35% / 80% / 45%      42% / 84% / 43%
Webcast (Online CME Distributors)    42% / 88% / 46%     50% / 97% / 47%      46% / 93% / 47%
Webcast (Peer-Reviewed Journals)     45% / 80% / 35%     38% / 90% / 52%      42% / 85% / 44%
Webseries                            53% / 90% / 37%     50% / 86% / 36%      52% / 88% / 37%

Cumulative average post-test (knowledge): 87%

Pre/Post-Test Results: Competence (average % correct)

Format                               LO 2B Pre/Post/Dif   LO 3 Pre/Post/Dif   Average Pre/Post/Dif
Live Satellite Symposia              21% / 84% / 63%      29% / 88% / 59%     25% / 86% / 61%
Print Monographs                     18% / 78% / 60%      18% / 78% / 60%     18% / 78% / 60%
Webcast (Online CME Distributors)    33% / 87% / 54%      19% / 80% / 61%     26% / 84% / 58%
Webcast (Peer-Reviewed Journals)     48% / 94% / 46%      45% / 86% / 41%     47% / 90% / 44%
Webseries                            44% / 86% / 42%      43% / 84% / 41%     44% / 85% / 42%

Cumulative average post-test (competence): 85%
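The Dif and Average values follow directly from the reported Pre and Post percentages. A minimal check, using the live-symposia knowledge results (27%/78% for LO 1, 40%/95% for LO 2A) and assuming half-up rounding to whole percentages:

```python
# Relationship among the reported Pre, Post, Dif, and Average columns,
# using the live-symposia knowledge values quoted above.

def half_up(x):
    """Round half up (e.g., 33.5 -> 34); assumed to match the table's rounding."""
    return int(x + 0.5)

lo1_pre, lo1_post = 27, 78    # LO 1 Knowledge
lo2a_pre, lo2a_post = 40, 95  # LO 2A Knowledge

dif_lo1 = lo1_post - lo1_pre                    # Dif = Post - Pre
avg_pre = half_up((lo1_pre + lo2a_pre) / 2)     # Average of the two LO Pre scores
avg_post = half_up((lo1_post + lo2a_post) / 2)  # Average of the two LO Post scores

print(dif_lo1, avg_pre, avg_post)  # 51 34 87
```

The same arithmetic reproduces every Dif and Average cell in both tables to within rounding.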
Methods

AcademicCME has embarked on a cross-format learner outcomes analysis within the field of multiple sclerosis. This analysis comprises seven multiple sclerosis programs, initiated in the fall of 2013.

The educational designs examined in this study include two live events at state medical societies, two print monographs, two 1-hour webcasts, and one four-part, 30-minute, on-demand webseries leveraging the Learning Actions Model. These CME programs are distinct from each other; however, because they follow similar learning objectives, agendas, and outcomes formats, they are deemed comparable.
Program outcomes data are derived from a series of pre/post-test questions aimed at the identified learning objectives. Each learning objective is addressed by multiple questions. The correct-response rates for all questions for a given learning objective were averaged to yield a single percentage correct per learning objective (pre- and post-test).
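The averaging step can be sketched as follows; the question-level correct-response rates here are illustrative placeholders, not the study's raw data:

```python
# Sketch of the scoring described above: each learning objective (LO) is
# covered by several pre/post-test questions, and the per-question
# correct-response rates are averaged to one pre and one post percentage.
# The question-level rates below are hypothetical examples.

def lo_score(question_rates):
    """Average a list of per-question correct-response rates (0-100)."""
    return round(sum(question_rates) / len(question_rates))

# Hypothetical: an LO covered by three pre/post-test questions
pre_rates = [22, 28, 31]   # % correct on the pre-test, per question
post_rates = [75, 80, 79]  # % correct on the post-test, per question

pre = lo_score(pre_rates)    # single pre-test percentage for the LO
post = lo_score(post_rates)  # single post-test percentage for the LO
dif = post - pre             # pre/post differential reported in the tables

print(pre, post, dif)  # 27 78 51
```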
All program evaluation data are learner-reported (e.g., content usefulness) and scored on a scale from 1 to 5, with 1 being "strongly disagree" and 5 being "strongly agree."

For this study, our learning objectives (LOs) followed a consistent theme: LO 1 (Knowledge), LO 2A (Knowledge), LO 2B (Competence), LO 3 (Competence).
Future Work

- Increasing our sample size, both in the number and variety of program designs (including, for example, slides with audio and shorter webcast modules).
- Tracking outcomes over time within a defined clinician cohort.
- Expanding this investigation to other disease states, clinician specialties, and learner demographics.
- Developing more accurate methods for determining learner engagement, as certificate completions can be an inexact metric of clinician learning.
Findings

[Bar charts: average pre/post-test scores (0-100%) by format]

Live
Specialists were overwhelmingly the most represented audience.
Average knowledge and competence improvements were high. However, within the live format, the pre/post-tests are administered in print, with no enforceable requirement to complete the pre-test before the content is presented. It is therefore possible that learners completed the pre-test without giving it appropriate attention, which could affect the differential between the pre- and post-test.
Print
Attracted a diverse but specialist-centric audience.
In mailed print pieces, it is difficult to accurately assess the number of learners who read the monograph in its entirety.
Average knowledge and competence improvements were high.
The pre/post-test setup is similar to that of the live format, potentially causing an inflated measure of improvement from the pre- to the post-test.
Webcast Promoted with Online CME Distributors

Pre-test competence was low, likely due to a mostly non-neurologist audience. Regardless, AcademicCME's unique faculty panel format was effective, resulting in high post-test levels and demonstrating the success of this format even for learners outside the specialty.
Discussion
AcademicCME had the privilege of delivering multiple CME activities updating clinicians on the management of patients with multiple sclerosis. We applied our unique three-pronged approach, comprising a review of available evidence-based medicine, clinical trial analysis, and application to practice, in five different educational formats. These data begin to shed light on format-specific trends and highlight specific attributes of each.

With our method of content development, we produced cumulative average post-test scores above 80% in both knowledge and competence.

Learner-reported evaluations show that satisfaction with the learning objectives, content usefulness, and freedom from commercial bias were similar and positive across all formats, suggesting that these are not confounding factors.
Through these activities, AcademicCME was able to educate an estimated 8,000-9,000 readers/learners and offer 1,941 CME certificates. With respect to neurology specialists, AcademicCME issued 891 neurology certificates. It should be noted that certificate completions are an inexact method of gauging learner engagement and likely underestimate the number of learners reached. From our evaluation data, our neurology learners see approximately 34 patients per month with multiple sclerosis. These seven CME activities in five formats during the fall of 2013 therefore had a potential clinical impact of 30,294 multiple sclerosis patient visits per month.
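The cumulative figures quoted above can be recomputed directly from the per-format demographics; a minimal check (format labels abbreviated for readability):

```python
# Reproducing the cumulative figures from the per-format certificate counts
# in the demographics tables above.
certificates = {"Live": 224, "Print": 348, "Webcast (distributors)": 524,
                "Webcast (journals)": 492, "Webseries": 353}
neuro = {"Live": 214, "Print": 162, "Webcast (distributors)": 45,
         "Webcast (journals)": 315, "Webseries": 155}

total_certs = sum(certificates.values())   # 1,941 CME certificates
neuro_certs = sum(neuro.values())          # 891 neurology certificates

# Self-reported MS caseload: ~34 patients/month per neurology learner
potential_visits = neuro_certs * 34        # potential MS patient visits/month

print(total_certs, neuro_certs, potential_visits)  # 1941 891 30294
```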