
EVS Project: Methodology and Evaluation Strategy

Marija Cubric (Business School)
Amanda Jefferies (School of Computer Science)

www.herts.ac.uk
Contents

•  EEVS Project Summary

•  EEVS Methodology Summary
   –  Student questionnaire
   –  Staff questionnaire
   –  Student blogs
   –  Staff interviews

 
EEVS Project Summary

The background to the EEVS project:

•  Evaluating the use of EVS across our university following major institutional investment
   by the university in 2010-2011
•  JISC Assessment and Feedback Programme Strand B, alongside iTEAM (Strand A)
•  Prior work since 2004, now moving into university-wide adoption
•  8 Academic Schools – 3,845 students issued with a personal handset

The EEVS project objectives are …

•  To provide an up-to-date view of the student and staff experiences of EVS and their
   opinions of what makes for successful use of the technology, within a large-scale project
   and across multiple disciplines.
•  To identify, from evaluating the staff and student experiences, a set of critical success
   factors for introducing and maintaining the use of EVS in support of an institutional
   assessment and feedback strategy.

 
EEVS Methodology
Exploratory pragmatic approach:
•  Gaining an understanding of the meanings held by various stakeholders
•  Focus on practical applied research, integrating different perspectives to help interpret the data
   (Saunders et al., 2009)

Mixed-method data collection:
•  Qualitative data: students' blogs, staff interviews, staff reports
•  Quantitative data: students' questionnaire, staff questionnaire (in progress)

Time horizon: mainly cross-sectional (a 'snapshot' taken in 2011/12) with some elements of a longitudinal
study (students' blogs over a period of 2-4 weeks).

Sampling:
•  Self-selection (voluntary student participants for questionnaires)
•  Self-selection & purposive (voluntary student participants for blogs; further selection made to enable
   balanced representation across different Schools)
•  Purposive (heterogeneous): staff involved with or leading EVS adoption in different Schools
Student Questionnaire Design

Survey launched in March 2012

Includes 27 questions spread across 9 sections

Total number of responses N=590 (across 11 different Schools)

Response rate: TBC (based on total=TBC)

Data analysis with SPSS v9 incl. descriptive statistics, correlation
and exploratory factor analysis

Section                              Source
Background Information               Standard demographic questions
Usage                                Venkatesh et al., 2003
Ease of use                          Bangor et al., 2008
Impact on Learning                   Draper, XXXX
Performance Expectancy               Venkatesh et al., 2003
Engagement                           JISC Evaluation guidelines
Satisfaction                         JISC Evaluation guidelines
Operational procedures and funding   iTEAM questions
Free text comments                   –
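The analysis pipeline named above (descriptive statistics, inter-item correlation, exploratory factor analysis) was run in SPSS; purely as an illustration of the steps, the following is a minimal Python sketch. The item names, the synthetic Likert-style responses and the 2-factor choice are assumptions for demonstration, not the project's actual questionnaire data.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical Likert-scale (1-5) responses: 590 respondents, 6 items
rng = np.random.default_rng(0)
n_respondents, n_items = 590, 6
data = pd.DataFrame(
    rng.integers(1, 6, size=(n_respondents, n_items)),
    columns=[f"Q{i + 1}" for i in range(n_items)],
)

# 1. Descriptive statistics per item (mean, std, quartiles)
desc = data.describe()

# 2. Inter-item Pearson correlation matrix
corr = data.corr()

# 3. Exploratory factor analysis, here with 2 latent factors
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(data)      # per-respondent factor scores
loadings = fa.components_            # per-item factor loadings

print(desc.loc["mean"].round(2))
print(corr.round(2))
print("loadings shape:", loadings.shape, "scores shape:", scores.shape)
```

With a real questionnaire export, the synthetic DataFrame would be replaced by the survey file, and the number of factors would be chosen from scree-plot or eigenvalue inspection rather than fixed in advance.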
 
 
Staff Questionnaire Design (TBC)

Survey launched in June 2012

Includes TBD questions spread across TBD sections

Total number of responses N=TBD (across TBD different Schools)

Response rate: TBD (based on total=TBD)

Data analysis with SPSS v9 incl. descriptive statistics, correlation
and exploratory factor analysis

Section                   Source
Background Information    Standard demographic questions
TBD                       TBD
Free text comments        –
 
 
Student blogs

Student blogs were completed in a variety of ways according to individual preferences: Word files (46),
online blog entries (14), short video clips (11) and audio files (1), submitted via a restricted-access group
area on StudyNet (the UH MLE).

65 students registered for the web/written blog, of whom 27 were chosen to ensure a spread of gender
and programme. One failed to complete their blog.

They represented 7 Academic Schools: Business (6), Computer Science (7), Humanities (6), Law (4),
Education (1), Sports Management (1), Mathematics (1).

An equal male/female split was sought, but ratios varied according to the subjects studied; the final
split was 15M:11F.

All the bloggers had previously used EVS since at least January 2011.

The reflection period varied between 2 and 4 weeks in the period Nov 2011 – Feb 2012.
Qualitative analysis: Example

Summarise → Categorise

Student  Quote → Theme instance → Theme (sub-category) → Category

AJ4    "Text function would be useful to ask questions anonymously"
       → Anonymous text questions/answers → Potential use → Pedagogy

AJ6    "Instead of a lecturer asking a question and expecting one person to put their
       hand up and answer it, may use the writing function or give us four answers and
       let everyone pick what answer they think."

MC1    "EVS could be used in skills sessions"
       → Skills sessions

MC1    "Comparing answers with friends makes it more interesting"
       → Comparing answers → Good practice develops reciprocity and cooperation
       among students

MC15   "Changing channels needs to be explained"
       → Changing channels → Training → Facilitating conditions
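The summarise → categorise coding above was done by hand; as a minimal sketch of how the same structure could be held programmatically, the record type and grouping below are hypothetical illustrations, with two coded rows taken from the example.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class CodedQuote:
    """One row of the coding table: a quote summarised into a theme
    instance, then categorised into a sub-category and category."""
    student: str
    quote: str
    theme_instance: str
    sub_category: str
    category: str


coded = [
    CodedQuote("AJ4",
               "Text function would be useful to ask questions anonymously",
               "Anonymous text questions/answers", "Potential use", "Pedagogy"),
    CodedQuote("MC15",
               "Changing channels needs to be explained",
               "Changing channels", "Training", "Facilitating conditions"),
]

# Group theme instances by top-level category for reporting
by_category = defaultdict(list)
for row in coded:
    by_category[row.category].append(row.theme_instance)

for category, themes in by_category.items():
    print(f"{category}: {themes}")
```

In practice such grouping would feed the cross-case comparison of themes across Schools; dedicated qualitative-analysis software would serve the same role.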
Staff interviews (TBC)

Interviews  conducted  with  staff  across  the  university:    


•  Local  School  Champions  
•  Associate  Heads  and  Heads  of  School  
•  Lecturers  
•  Project  Director  
References
•  Bangor, A., Kortum, P., & Miller, J.A. (2008). The System Usability Scale (SUS): An Empirical Evaluation. International Journal of Human-Computer Interaction, 24(6).

•  Chickering, A.W., & Gamson, Z.F. (1987). Seven Principles for Good Practice in Undergraduate Education. http://www.uis.edu/liberalstudies/students/documents/sevenprinciples.pdf


•  D’Inverno, R., Davis, H., & White, S. (2003). Using a personal response system for promoting student interaction. Teaching Mathematics and its Applications, 22(4), 163-169.

•  Draper, S. W., & Brown, M. I. (2002). Use of the PRS handsets at Glasgow University. Interim Evaluation Report: March 2002. http://www.psy.gla.ac.uk/~steve/evs/interim.html, accessed April 2012.

•  Draper, S. W., & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20(2), 81-94.
•  JISC (2004). Effective Practice with e-Learning. http://www.jisc.ac.uk/media/documents/publications/effectivepracticeelearning.pdf

•  Kanter, R.M., Stein, B.A. & Jick, T.D. (1992) The Challenge of Organizational Change. Free Press. New York
•  Kennedy, G. E., & Cutts, Q. I. (2005). The association between students' use of an electronic voting system and their learning outcomes. Journal of Computer
Assisted Learning, 21(4), 260-268

•  Lorimer, J., & Hilliard, A. (2009). Use of an Electronic Voting System (EVS) to Facilitate Teaching and Assessment of Decision Making Skills in Undergraduate Radiography Education. In Proceedings of the 8th European Conference on e-Learning, Bari, Italy.

•  Moore, G. A. (1991). Crossing the Chasm. New York: Harper Business.


•  Nicol, D., & Draper, S. (June 2009). A blueprint for transformational organisational change in higher education: REAP as a case study. In J. T. Mayes (Ed.),
Transforming Higher Education through Technology-Enhanced Learning.

•  Oliver, M. (2006). New pedagogies for e-learning? ALT-J: Research in Learning Technology, 14(2), 133 - 134.

•  Robins, K., (2011) EVS in the Business School, University of Hertfordshire Internal Report
•  Thornton, H. A. (2009). Undergraduate Physiotherapy students’ choice and use of technology in undertaking collaborative tasks. Open University, UK, Milton Keynes

•  Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: toward a unified view. MIS Quarterly, 27(3), 425–478.
•  Willis, J. (2009) Using EVS in the School of Life Sciences, University of Hertfordshire Internal Report