
Running head: EVALUATION REPORT

Course Entry Training Module Effectiveness Testing

Sarah Barnhart

California State University, Monterey Bay

July 23, 2019

IST622 Assessment and Evaluation

Dr. Bude Su

Table of Contents

Section I – Introduction

Section II – Methodology

    Expected outcomes

    Learners

    Prototype Description

    Process

    Tryout Conditions

        Pre-Test and Post-Test

        Usability Survey

        Observation

Section III – Results

    Entry Conditions

    Instruction

    Outcomes

    Summary of collected data

    Recommendations

Section IV – Summary

Appendices

    Appendix A. Pre- and Post-Test Question #1 on Course Entry Comfort Level

    Appendix B. Optional Navigational Tutorial Button

    Appendix C. Invitation Email Sent to Participants

    Appendix D. Scheduled Observations

    Appendix E. User Testing Website

    Appendix F. Pre- and Post-Test Questions

    Appendix G. User Survey

    Appendix H. Observation Checklist

    Appendix I. Descriptive Statistics



Course Entry Training Module Effectiveness Testing

Section I - Introduction

The results in this report are taken from test users who participated in Module 3 of an overall capstone project. The name of this module is "Steps to Enter Courses in Oasis." The module teaches new employees how to enter coursework from a transcript into the Oasis PeopleSoft system used in the Office of Admissions at California State University, Monterey Bay. The prototype was created using Adobe Captivate and involves various user interactions, such as a quiz, a drag-and-drop activity, a type-in-the-answer question activity, and a simulated course entry environment. Data were collected from pre- and post-tests, a user survey, and seven individual observations. The results measure user learning gains and the overall usability and effectiveness of the course.

Section II – Methodology

Expected outcomes

Upon completion of this prototype, learners will be able to enter coursework successfully

and independently without error. Additionally, the following learning objectives are expected:

• Learners will be able to distinguish a district transcript from a non-district transcript.
• Learners will be able to apply the proper default settings in the system.
• Learners will be able to identify course codes, grade codes, and units on a transcript.
• Learners will be able to determine which grades are acceptable in the system.

The hypothesis is that there will be a statistically significant increase in mean scores from the pre-test to the post-test.



Learners

The target audience for this testing group is new student assistants and employees who do not already know how to enter coursework in Oasis. Because this is the third module in a 3-part capstone project, at this point in the training users will already have some of the basic knowledge necessary to navigate through Oasis. For the purpose of this evaluation, all participants were specifically chosen due to their roles in the Admissions office. There were 7 total participants: 5 staff members and 2 student assistants. None of the participants had entered courses in the newest version of Oasis prior to this training; however, some users have been in the Office of Admissions for a significant number of years and may have seen or done course entry on a previous system.

Because of this history, participants were each asked to rate their comfort level with course entry on a 5-point scale. The pre-test indicates that 5 of the 7 users were initially "not comfortable" entering courses in Oasis, and the other 2 users were only "somewhat comfortable." The post-test indicates that 3 users improved their comfort level to "very comfortable," 2 users improved to "somewhat comfortable," and 1 user remained "somewhat comfortable" (Appendix A).

Prototype Description

The prototype used in this evaluation is the third part of a 3-part training designed for new student assistants who help with back-office transcript processing. The prototype is a module that teaches users how to enter coursework in Oasis, which is the PeopleSoft system used in the Admissions Office at CSUMB. The course was designed using Adobe Captivate and contains a series of user interactions. All interactions contain friendly feedback, and some contain helpful hints in areas where users may need guidance on how to proceed.

The prototype was delivered as a link from a website for easy accessibility and contains

the following elements:

• Learning objectives
• An admissions processing overview video
• Explanation summaries on how to read a transcript, including 2 district transcript examples (Table 1.1)
• A 3-question quiz
• An "Apply Defaults" demo video (Table 1.2)
• A type-in-the-defaults knowledge check (Table 1.3)
• A course entry demo video
• An explanation of each course entry field designed as a click-through carousel (Table 1.4)
• A drag-and-drop knowledge check (Table 1.5)
• A final course entry user simulation

The initial Admissions Processing Overview video is meant to capture learner interest and help learners understand the importance of their role in processing documents. An optional navigation tutorial at the beginning of the course lets users learn the buttons and features embedded in the training if they choose (Appendix B). The prototype also contains optional narration and closed captions. The images within the training prototype were all captured from the internal system test environment. The course is designed to teach the various aspects of course entry in micro-parts.



By breaking the different elements of course entry into parts, the learner has the opportunity to become comfortable with reading the transcript prior to working with the system. Tables 1.1 through 1.5 below show a few selected lessons and knowledge checks within the design of the course.

Table 1.1

Table 1.2

Table 1.3

Table 1.4

Table 1.5

Process

The users were selected based on their current course entry knowledge and their familiarity with Admissions Office document processing and with Oasis. All users received an email (Appendix C) that was drafted for approval by both office managers in the Office of Admissions. For managerial support and to increase user participation, the email was sent directly from each user's manager. Originally 6 users were asked to participate, but a seventh user joined after realizing that she had not yet learned course entry. All participants provided their availability and chose their timeslots for user testing (Appendix D). Each user was provided the invitation link to the website at the time of their scheduled user testing.

Tryout Conditions

The website provided directions by listing 4 steps that users followed during testing (Appendix E). Four users completed their user testing from their workstations; three completed theirs from a laptop in a conference room due to the potential for interruptions. For audibility, users listened with headphones or speakers to complete the training. Once the user opened the website, they were asked first to complete and submit the pre-test, next to complete the training, then the post-test, and last the user survey. The data from the pre-test, post-test, and usability survey were all collected in individual Google Forms.

Pre-Test and Post-Test. The pre-test and post-test (Appendix F) are identical and consist of 13 very specific and detailed questions about the course entry process.

Usability Survey. The user took the usability survey (Appendix G) as the last step in the user testing. The survey includes 9 total questions: three multiple-choice questions, three questions that ask the user to rate their answer on a 5-point scale, and three long-answer questions asking the user to provide feedback.

Observation. The observation checklist (Appendix H) was used during each observation; seven checklists were printed and completed, one for each individually scheduled user testing. Observations were noted from user responses, and every notable course interaction was assessed at the time of observation for future usability improvements.

Section III – Results

Entry Conditions

All users were entering coursework on the newest system for the first time, and all seven testers were experienced with the Oasis system. All users also had the basic computer skills required to navigate the training module, and there was no difficulty navigating through the steps on the website to complete the pre- and post-tests and user survey. One significant observation was that only two users played the navigation tutorial; both were student assistants, the primary target audience. Additionally, the maximum time allocated to each user test was 40 minutes, and all users took approximately 35-40 minutes to complete all aspects of the user testing as expected, including the pre- and post-tests, the module training, and the user survey.

Instruction

All but one user answered all three formative evaluation questions correctly; these were delivered as a 3-question quiz. One user struggled with the wording of the second question and asked for clarification. All 7 users rated the difficulty level as "just right." Three users skipped the carousel feature (see Table 1.4 above), which contains important information about each course entry field. All seven users required intervention because the "submit" button on the drag-and-drop knowledge check slide was not functioning properly and would not allow them to move forward. Despite some usability challenges, 4 users rated their experience at 4 points, and 3 users rated their experience at 5 points (Table 2.1), with 5 points being the most impactful (Table 2.2).

Table 2.1

Table 2.2

Outcomes

The observations allowed for important takeaways from user interactions. Learners understood how to navigate through the module but had difficulty with some features. Most users stated that they liked the visual aids and the different practice opportunities. Only 1 user reported never feeling frustrated while using the training; most users had some difficulty with the training's features. One user stated, "I sometimes found myself lost on how to click within the training." Another user had a similar experience and stated, "Instruction on moving forward to next segment was confusing." Thus, changes to the module will need to be made prior to the final capstone delivery.

The user survey emphasized a request for "brutal honesty," and all users gave quality feedback and suggestions for improvement. Two users suggested offering more course entry examples. Three users suggested improving the user functions. One user stated a desire for a "more clear indication of what to select to move forward." Another user stated, "I think question/clarification boxes might be helpful for clicking issues."

Summary of collected data

Despite some usability challenges, all users scored higher on the post-test than on the pre-test (Table 3.3). The pre- and post-test scores were obtained manually by comparing individual user scores from both tests. Since users took the tests in Google Forms, the results were extracted from both tests into a Google Sheet. One point was granted for each correct answer on each test. Once the scores were totaled manually, they were placed in an Excel file (Table 3.1) and compared for statistical significance. Only 12 of the 13 questions were graded and included in the learners' scores; the first question referred to course entry comfort level and was not relevant to measuring learning transfer. The results were meant to determine whether the lesson had any effect on the user's understanding of how to enter coursework correctly in Oasis.
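
To illustrate this scoring step, the sketch below (Python) shows one way the exported results might be totaled. The file names, column names, and answer key are hypothetical placeholders, since the actual Google Forms export layout is not reproduced in this report.

    # Minimal scoring sketch; file names, column names, and the answer key
    # are hypothetical placeholders for the actual Google Forms exports.
    import csv

    # Questions 2-13 are graded; question 1 (comfort level) is excluded.
    ANSWER_KEY = {f"Q{i}": f"answer-{i}" for i in range(2, 14)}  # placeholder key

    def score_responses(path):
        """Return {participant: score}, granting one point per correct answer."""
        scores = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                scores[row["Participant"]] = sum(
                    1 for q, answer in ANSWER_KEY.items() if row.get(q) == answer
                )
        return scores

    pre_scores = score_responses("pretest_responses.csv")
    post_scores = score_responses("posttest_responses.csv")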

The post-test mean of 7.714 was significantly higher than the pre-test mean of 4.142 (Appendix I). To measure the results, a paired two-sample t-test for dependent samples was run with 6 degrees of freedom (Table 3.2). The data provide evidence of learning transfer and demonstrate the effectiveness of the course lesson. Since the hypothesis is directional, the one-tail values are the relevant ones for comparison. The t statistic of 7.43 in absolute value is much larger than the t-critical value of 1.94, and the one-tail p-value of 0.00015 is much smaller than the standard 0.05 alpha level. Therefore, the null hypothesis is rejected, and the results are statistically significant.
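
As a check on these figures, the following minimal sketch (assuming SciPy is available) reproduces the Table 3.2 values from the raw scores in Table 3.1. Note that SciPy's paired t-test returns a two-tail p-value, which is halved here for the directional (one-tail) hypothesis.

    # Paired two-sample t-test on the Table 3.1 scores.
    from scipy import stats

    pre = [6, 5, 2, 4, 4, 4, 4]
    post = [10, 8, 8, 8, 7, 7, 6]

    t, p_two_tail = stats.ttest_rel(pre, post)
    print(round(t, 3))               # -7.426 (negative because pre is listed first)
    print(round(p_two_tail / 2, 6))  # 0.000153, the one-tail p-value
    print(round(p_two_tail, 6))      # 0.000307, the two-tail p-value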

In addition, the effect size was calculated to determine whether the results were practically significant. The pre-test scores prior to training (M = 4.14, SD = 1.21) and the post-test scores after training (M = 7.71, SD = 1.25) differed significantly [t(6) = 7.43, p < .05, d = 2.85]. The effect size of 2.85 is much larger than the conventional large-effect benchmark of 0.8; thus, the observed difference between the two tests is practically significant, and the training had a significant effect on the learner.
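
The report does not state which variant of Cohen's d was used. One common choice, sketched below with the Table 3.1 scores, pools the two standard deviations and yields approximately 2.89, close to the reported 2.85.

    # Cohen's d using the pooled-standard-deviation formula:
    # d = (mean_post - mean_pre) / sqrt((var_pre + var_post) / 2)
    from statistics import mean, variance

    pre = [6, 5, 2, 4, 4, 4, 4]
    post = [10, 8, 8, 8, 7, 7, 6]

    pooled_sd = ((variance(pre) + variance(post)) / 2) ** 0.5
    d = (mean(post) - mean(pre)) / pooled_sd
    print(round(d, 2))  # ~2.89 with this variant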

Pre-Test Score    Post-Test Score
      6                 10
      5                  8
      2                  8
      4                  8
      4                  7
      4                  7
      4                  6

Table 3.1

t-Test: Paired Two Sample for Means

                                  Variable 1     Variable 2
Mean                             4.142857143    7.714285714
Variance                         1.476190476    1.571428571
Observations                               7              7
Pearson Correlation               0.46897905
Hypothesized Mean Difference               0
df                                         6
t Stat                          -7.426106572
P(T<=t) one-tail                 0.000153396
t Critical one-tail              1.943180281
P(T<=t) two-tail                 0.000306792
t Critical two-tail              2.446911851

Table 3.2

Table 3.3

Recommendations

Based on the observations, user survey, and direct in-person user feedback, the following recommendations are suggested:

1. Provide an explanation for when the user should click on the informational icon.

2. Make the embedded video volume higher so it is equivalent to the sound level of

the narration.

3. Include more schools and examples of district transcripts.

4. Include more knowledge checks and activities.

5. Make the instructions clear so the user knows when to move forward to the next slide.

6. Per user feedback regarding the agent, consider having "a pair of people with contrasting voices" so that the same consistent voice does not become draining or overwhelming.

7. In the simulation, make the tab key the feature that advances the user to the next field, rather than the shift key, since the tab key is what is used in the actual course entry process and is what the demo video explains.

8. Fix the drag-and-drop feature so that the "3.00" unit option snaps into the correct spot when it is released over the correct field.

9. Correct the “submit” button in the drag and drop knowledge check activity. It is

currently not functioning properly.

10. Make the informational carousel stop at the last information piece. The repeat

feature is confusing to the user and does not make it clear when to move forward.

11. Remove the forward option in the informational carousel slide so that the user

will not skip past the important pieces of information.

12. Add highlights to each field in the informational carousel so that users understand

which field the information is referencing.



13. Correct the second “click here” button in the district transcripts examples slide so

that the button does not change colors if the user clicks outside of the button

feature.

14. Remove the exit option until the end of the training.

15. Make it clear when the user is done with the training and can close out.

16. Remove the duplicate audio that replays when the user finishes watching the first district example video and is automatically returned to click on the second district example video.

17. Change the feedback features in the simulation so that the user is prompted where

to click if they click anywhere outside of the intended target.

18. Correct the fill-in-the-blank knowledge check activity so that the user is prompted

where to click next. Possibly add highlight features to each field.

19. Change the drag-and-drop option "3" to a 3-digit course number to make it more distinguishable from the "3.00" units draggable and the "2" data-row draggable.

Section IV – Summary

Despite the users having difficulty with some of the features, every user gained a better understanding of how to enter courses in Oasis. The learning gains are clear from the evaluation data and statistical evidence. However, only 3 users said that they really enjoyed the whole training, whereas 4 users said that only some of the activities were enjoyable. Much of the frustration seems to lie in the difficulty learners had with the module's functionality.

Based on the overall results, it is recommended that all inactive user functions be corrected, ease of use be addressed, and the instructions be made extremely clear. The ability to watch each user was very insightful and provided the opportunity to catch malfunctioning features that users would not have caught themselves. The entire process was informative and will ultimately improve the course design. It is clear that the course entry module is not complete, but it has the potential to be extremely valuable in the Office of Admissions as the very first computer-based processing training.



Appendices

Appendix A. Pre- and post-test Question #1 on Course Entry Comfort Level

Appendix B. Optional Navigational Tutorial Button



Appendix C. Invitation Email Sent to Participants

Appendix D. Scheduled Observations



Appendix E. User Testing Website



Appendix F. Pre- and Post-Test Questions



Appendix G. User Survey



Appendix H. Observation Checklist



Appendix I. Descriptive Statistics

                      Pre-Test Score    Post-Test Score
Mean                     4.142857143        7.714285714
Standard Error           0.459221465        0.473803541
Median                             4                  8
Mode                               4                  8
Standard Deviation       1.214985793        1.253566341
Sample Variance          1.476190476        1.571428571
Kurtosis                 1.778772112        1.492561983
Skewness                -0.366392178        0.739707742
Range                              4                  4
Minimum                            2                  6
Maximum                            6                 10
Sum                               29                 54
Count                              7                  7
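
These statistics can be reproduced from the Table 3.1 scores. As a sketch (assuming pandas is available), pandas' sample skew() and kurt() correspond to the Excel SKEW and KURT values shown above.

    # Reproduce the descriptive statistics above from the Table 3.1 scores.
    import pandas as pd

    scores = pd.DataFrame({
        "Pre-Test Score": [6, 5, 2, 4, 4, 4, 4],
        "Post-Test Score": [10, 8, 8, 8, 7, 7, 6],
    })

    print(scores.describe())  # count, mean, std, min, quartiles, max
    print(scores.skew())      # -0.366 and 0.740, matching the Skewness rows
    print(scores.kurt())      # 1.779 and 1.493, matching the Kurtosis rows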
