
The Distance Learning Benchmarking Club Final Summary Report

Paul Bacsich, September 2011

Summary

Spinning off from the DUCKLING project,[1] a Distance Learning Benchmarking Club of four institutions (initially seven) was set up across the world. All four were successfully benchmarked using a slightly modified version of Pick&Mix. This short report describes the history and outcomes of the Club, within the wider context of developments in Pick&Mix and in particular the progress of benchmarking at two further institutions outside the Club.

History

Arising out of the planning for the DUCKLING project bid, I was asked in early 2009 by Professor Gilly Salmon, then head of BDRA at the University of Leicester, to organise a small club of higher education institutions similar to the University of Leicester in their approach to distance learning, who would benchmark their e-learning in a collaborative way. The University would also be a member of the club.

A preliminary project plan was prepared in June 2009 and it was agreed that, via the consultancy company that I own (Matic Media Ltd), I would be paid a small amount (a fee of £2000) to organise the club, but that any methodology development and support of the individual institutions would be supplied free of charge by myself, on the basis that it would assist my further development of the Pick&Mix methodology. It was further agreed that BDRA staff would contribute to the support and organisation of the Club on the basis of their expertise in Pick&Mix benchmarking, the University (via BDRA) being at the time one of the earliest and most knowledgeable users of Pick&Mix.

No formal agreement was signed; in some ways this was a problem, but in others it was vital to the success, as it is likely that any detailed business analysis of the proposition would have led my colleagues in Matic Media Ltd to recommend against the proposal, especially since the rate I charged the University of Leicester was about half of the commercial rate. As a matter of principle, I never accept any funding for the development of Pick&Mix (for many reasons, including to protect the IPR and to continue to make it available via Creative Commons), but I am normally paid for its support. The rough tariff, dating from the HE Academy era of the Pilot Phase of benchmarking in the UK, is about 10 days' work per institution. Try as one will, the tariff rarely comes out to much less than this in the end with institutions new to benchmarking.

Since all the benchmarking I had done up till then was for institutions within the UK, neither the University nor I had any experience of supporting benchmarking at a distance, and the partners assumed that use of video conferencing tools would make sense (in fact the very first idea was to use GRID conferencing, not just Elluminate or similar), but in the end these were not used to any large extent. The original plan was that:

[1] The DUCKLING project (but not the Club) was funded by JISC. See http://www.jisc.ac.uk/whatwedo/programmes/elearning/curriculumdelivery/duckling.aspx


A Distance Learning Benchmarking Club of seven universities across the world, all active in distance online learning in a dual-mode fashion, will benchmark their online distance learning activity twice in the next two years, once in Autumn 2009 and once in Autumn 2010.

However, in only one institution did this pure paradigm of benchmarking twice occur; most activity took place in the period September 2010 to February 2011. The benchmarking system was described as follows:

The benchmarking system to be used is a new version of the Pick&Mix system used already by 24 institutions in the UK for benchmarking e-learning and currently in use by two more (another UK institution and an Australian institution). It has been slightly adapted to have more focus on distance e-learning and serious implementation (step-change), but without going beyond the guidelines on numbers of criteria used. The basis for the new set of core criteria is the set of Critical Success Factors defined by the Re.ViCa project using extensive international input from a wide-ranging International Advisory Committee of e-learning experts; usefully, there is a substantial overlap with the current UK set of core criteria for Pick&Mix...

The Club and the Pick&Mix methodology draw on five phases of benchmarking with UK HEIs using the Pick&Mix system, and on wider experience of benchmarking in Europe and internationally (in particular, but not only, the Re.ViCa project).

Early in the history of the Club there was already interest from open universities and additional dual-mode institutions, with requests to join. It was decided early on that the Club was a closed club, but that I would take steps and soundings to see how to set up another club later. For that reason, the Club is more pedantically described as "the first dual-mode distance learning benchmarking club". Plans are now being formed to set up a second benchmarking club involving some but not all of the institutions in the first Club, and some new members.

Project partners

Earlier scoping work on such a Club suggested that between 4 and 8 universities should take part: at least 4 (to ensure sufficient diversity and common working) and at most 8 (to ensure viable operation of virtual project meetings using synchronous video tools). In general terms, even numbers are better than odd (more suitable for subgroups), but this is not essential. It was further agreed that each partner should be a university with at least 10,000 students, whose distance online learning offering has in excess of 1,000 students and also has a wide range of programmes.

The final list of 7 institutions was stable by September 2009:

1. (lead) University of Leicester, UK
2. University of Liverpool, UK
3. University of Southern Queensland, Australia
4. Massey University, New Zealand
5. Thompson Rivers University, Canada
6. Lund University, Sweden
7. KTH (Royal Institute of Technology), Sweden.


This was the list of universities reported when I presented an invited paper on benchmarking and quality to the ENQA invitational workshop on quality in e-learning in Sigtuna (Sweden) in October 2009.[2] All of those I had met personally at conferences in 2009, and all had confirmed to me face to face and in writing (email) that they would join the Club and do benchmarking.

The finalists

However, despite the best endeavours of myself and Professor Gilly Salmon, we lost three institutions along the way. It is not the place of this paper to speculate on the motives and the changes behind this (that is more the role of a competitor analyst), but my summary observations would be as follows:

- The University of Liverpool, UK had seen some changes in personnel, including a new VC, and a growing and deepening relationship with Laureate; perhaps they felt that benchmarking had no value since their path was fixed by higher powers.
- The University of Southern Queensland, Australia suffered the death of a key individual; they took some time to re-engage, but by that time they had recruited Professor Gilly Salmon and it seems that their priorities went off in a different direction.
- Massey University, New Zealand were initially enthusiastic, as indeed were both of their senior staff whom I met, but for whatever reason (perhaps changes of LMS and recently some reorganisations) they never got started. Interestingly, in the last year they have collaborated well on other projects.

This left the final four: one in the UK, two in Sweden, and one in Canada.

1. University of Leicester, UK: due to some staff and internal issues they did not finish the benchmarking until late 2010.
2. Thompson Rivers University, Canada: they managed to do two phases, and I managed to attend their final scoring meeting (and two of their team met me in London).
3. Lund University, Sweden: they did a thorough job, as one of three methodologies they used; I attended the kick-off meeting.
4. KTH (Royal Institute of Technology), Sweden: they took longer to get going but in the end did a very thorough job; I met their team three times, twice at KTH and once in London.

In line with earlier work for the Higher Education Academy, four is a sufficient basis on which to make useful comparisons.

What is benchmarking distance learning?

Across the world, by no means all distance learning is yet imbued with substantial amounts of e-learning. So in theory we had to decide between two options:

1. Benchmarking e-learning within the Distance Learning slice of the institution, and
2. Benchmarking distance learning, using a distance learning mood of Pick&Mix.
[2] See the presentation linked to http://www.enqa.eu/eventitem.lasso?id=249


There are already moods of standard Pick&Mix (the so-called bis and ter forms) for general teaching/learning and IT respectively. There has been talk of a lambda mood for libraries, and there is no conceptual difficulty with a delta mood for distance learning; in fact a draft version was prepared, based on (but not only on) replacing all uses of e-learning by distance learning. In recent months even newer moods of Pick&Mix have appeared which are more like variant forks, for FE, green IT, OER and employer engagement. All this activity (not controlled by me but usually done by people in association with me) I take as evidence of the health of the underlying methodology.

However, I stressed then and continue to stress now that maintaining compatibility with the standard form of Pick&Mix is important. This favours option 1. And in reality, in the developed world (as we found in the Re.ViCa project), almost all distance learning has e-learning aspects: in particular, the institutions in our Club have enough online learning in their distance learning to make the distinction rather nebulous. However, one specific criterion, on strategy, needs to be interpreted differently (details later).

Engagement approach

Our original approach to engagement with institutions now looks, with hindsight, rather naive:

The standard approach to benchmarking clubs used by Pick&Mix in the UK has been adapted to the global context, replacing the face to face meetings with teleconferencing sessions and taking into account the shorter timescale. (The standard UK approach in its most recent incarnation, in Wales, is described at http://elearning.heacademy.ac.uk/weblogs/gwella/?p=14.) The final scoring meeting in each institution will be remotely monitored by the central benchmarking team, again using teleconferencing methods.

However, it still works well in many institutions. Those from a quality background will want to note that the type of engagement approach used in benchmarking is not quite the same as some institutions are used to from national quality bodies. The Pick&Mix version is called the Iterative Self-Review Process with use of expert moderators. It has the following features:

- It encourages a more senior level of participation from the institution: the result is theirs, not the moderators'.
- It allows them to get comfortable over time with the criteria as they apply to their institution.
- It helps them move directly to implementation of change.
- But it selects against complex methodologies (not an issue with Pick&Mix, as it is simple compared with others).
- And it requires more effort from moderators.

This aspect was gone into quite thoroughly in my ENQA presentation. Note that methodologies such as Pick&Mix do not require this approach, but it was what our UK institutions desired; for all the public criterion systems (not only for Pick&Mix) there was strong resistance to a documentary review approach.

The sequence of meetings held at each institution should be the following:

1. Introductory meeting, followed by initial collection of evidence and selection of supplementary criteria
2. Mid-process meeting, followed by further collection of evidence
3. Scoring rehearsal meeting, followed by final tweaks on and chasing of evidence
4. Scoring meeting
5. Reflection meeting, to move to change.

In a national situation with more support, the moderators would attend each meeting. In our distributed context this was always going to be hard, even virtually (there was no travel budget provided). But our aim was that, with careful planning and scheduling for phase 2, we should aim for the moderators to attend each introductory meeting, scoring meeting and reflection meeting.

It rapidly became clear that the virtual moderation method did not work. With hindsight the reasons might have been obvious: in many institutions, whatever the initial commitment from senior managers (usually just one), the benchmarker has a sales job to do, and this cannot be done virtually. Thus it very rapidly became clear that we needed a Plan B. Fortunately, using existing conference commitments, adding elements of local travel to international speaking invitations, and often relying on hospitality from host institutions, I managed to visit all the institutions at least once, and several of them met with me when they travelled. Natural meeting points included Online Educa Berlin each year, the Learning and Technology World Forum in London in January 2010 (TRU) and BETT in January 2011 (KTH). I also visited Lund University and KTH before and just after the ENQA event in October 2009, and revisited KTH in 2010 when benchmarking elsewhere in Sweden. I was fortunate to visit TRU as a side trip from a working group meeting at Athabasca University in September 2010.

I cannot recommend this approach as replicable, even for myself again, for reasons not entirely obvious: quality was a hot topic in that era and I was well funded to go to other conferences as a result of EU and national projects (Becta especially). And it would not have worked for antipodean institutions (that is, Australia and New Zealand). In the way of events, I am now going to New Zealand for a period in 2012, but rather too late and not at the invitation of the institutions who were to be involved.

Related analytic work

Three pieces of concordance work were planned:

1. A concordance will be generated between the Australian/New Zealand ACODE system and Pick&Mix. This is seen as important for institutions in that region. The concordance facilitates the use of a common evidence base for both ACODE and Pick&Mix benchmarking (see the sketch after this list).
2. Earlier work on lessons to be learned from the UK QAA precepts in relation to e-learning (http://www.qaa.ac.uk/academicinfrastructure/CodeOfPractice/section2/appendix.asp) will be updated. This is seen as important for the UK, and a current commercial client of Pick&Mix is keen on this also. This work will feed into work being done for the QA-QE SIG of the UK Higher Education Academy and in turn to the QAA. A similar correlation should be done, with the Swedish partners, on the correlation to the Swedish criteria for quality in e-learning.
3. Lund University is currently undertaking benchmarking using the E-xcellence system, which is popular in certain EU circles. Again, earlier work on the Pick&Mix to E-xcellence concordance will be updated, with the help of Lund University.
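To make concrete what a concordance looks like in practice, here is a minimal sketch of one held as data, in Python. The Pick&Mix codes are taken from this report, but the ACODE identifiers are invented placeholders (the real concordance is in the appendices to the full report), so this illustrates the shape of the mapping rather than its content:

    # Sketch only: a concordance held as a mapping from Pick&Mix criterion
    # codes to the ACODE items judged to cover similar ground. The ACODE
    # identifiers below are placeholders, not the real concordance.
    ACODE_CONCORDANCE: dict[str, list[str]] = {
        "06": ["ACODE-1"],  # e-Learning Strategy <-> institutional policy
        "10": ["ACODE-6"],  # Training <-> staff professional development
        "94": ["ACODE-8"],  # Student Satisfaction <-> student experience
        # ... remaining mappings elided ...
    }

    def reusable_evidence(pickmix_code: str) -> list[str]:
        """Return the ACODE items whose evidence base can be reused for the
        given Pick&Mix criterion, supporting a common evidence base."""
        return ACODE_CONCORDANCE.get(pickmix_code, [])

A lookup in this direction (Pick&Mix to ACODE) suits an institution that has already assembled evidence for ACODE and wants to reuse it in a Pick&Mix exercise.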

In the end the work under #2 did not need to be done under this project; it was subsumed into my work as one of those who fed information back into the QA-QE SIG critique of QAA work on quality of e-learning. I am not the only person who was disappointed when it became clear that the QAA almost completely ignored the input from the QA-QE SIG. The work under #3 was excellently carried out by Ebba Ossiannilsson at Lund University. I developed the ACODE-Pick&Mix concordance but struggled fruitlessly to get the ACODE experts to comment on it.

The new Core Criteria

When the project started, a revision of Pick&Mix was considered. Pick&Mix has nearly 100 criteria now, but in the UK version only the first 20 are Core. Recent analysis had suggested that some of these UK core criteria were no longer so relevant, and that others were less relevant for non-UK work. In addition, for institutions seriously engaged in e-learning, there was a different balance of considerations, an issue also explored by Re.ViCa. Finally, recent developments in several countries on the Student Experience (Learner Voice) in e-learning had led to an increasing focus on student aspects. This led to the following set of Core Criteria being proposed for Club members. (Note that it was envisaged from the beginning that different application areas would have different core criterion sets.) The 8 criteria that are also in the UK core are marked with an asterisk.
Code | Criterion name | Criterion level 5 statement
04* | Usability | All systems usable, with internal evidence to back this up.
06* | e-Learning Strategy | Regularly updated e-Learning Strategy, integrated with Learning and Teaching Strategy and all related strategies (e.g. Distance Learning, if relevant).
07* | Decisions on Projects | Effective decision-making for e-learning projects across the whole institution, including variations when justified.
10* | Training | All staff trained in VLE use, appropriate to job type and retrained when needed.
12* | Costs | A fit for purpose costing system is used in all departments for costs of e-learning.
13* | Planning Annually | Integrated annual planning process for e-learning, integrated with overall course planning.
16* | Technical Support to Staff | All staff engaged in the e-learning process have "nearby" fast-response technical support.
19* | Decisions on Programmes | There is effective decision-making for e-learning programmes across the whole institution, including variations when justified.
22 | Leadership in e-Learning | The capability of leaders to make decisions regarding e-learning is fully developed at departmental and institutional level.
29 | Management Style | The overall institutional management style is appropriate to manage its mix of educational and business activities.
35 | Relationship Management Upwards | The institution has effective processes designed to achieve high formal and informal credibility with relevant government and public agencies overseeing it.
53 | Reliability | The e-learning system is as reliable as the main systems students and staff are used to from their wider experience as students and citizens.
58 | Market Research | Market research done centrally and in or on behalf of all departments, and aware of e-learning aspects; updated annually or prior to major programme planning.
60 | Security | A system where security breaches are known not to occur, yet which allows staff and students to carry out their authorised duties easily and efficiently.
91 | Student Understanding of System | Students have good understanding of the rules governing assignment submission, feedback, plagiarism, costs, attendance, etc. and always act on them.
92 | Student Help Desk | Help Desk is deemed as best practice.
94 | Student Satisfaction | Frequent (ideally annual) Student Satisfaction survey which explicitly addresses the main e-learning issues of relevance to students.

Criterion 06 is paired with a doppelganger criterion 06d:

06d | Distance Learning Strategy | Regularly updated Distance Learning Strategy, integrated with Learning and Teaching Strategy and all related strategies (e.g. e-Learning, if relevant).

This makes 18 criteria in all. As usual, institutions that were new to Pick&Mix benchmarking were free to select up to 6 additional Supplementary Criteria; in particular they could select all the UK core criteria (but see the next paragraph). Not many selected any additional criteria. The Pick&Mix criteria 01, 02 and 03 are recommended for use as "get you talking" practice criteria: this is because they are simple to understand and a good introduction to the issues, even if not as relevant as they were a few years ago. This new core set has now been used by other institutions, including Gotland University in Sweden, and was the one offered to a post-2002 university about to start on benchmarking, the University of Northampton, which over summer 2011 benchmarked its distance learning programmes in one School.
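As an illustration only (this is not part of any official Pick&Mix toolset), the structure just described, a fixed core set plus up to 6 supplementary criteria, each eventually scored at a level, can be captured in a few lines of Python. The criterion codes and names are from the table above; the assumption that scores lie on a 1-5 scale is inferred from the "criterion level 5 statement" column:

    # Illustrative sketch only, not official Pick&Mix tooling.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Criterion:
        code: str  # e.g. "04", or "06d" for the doppelganger criterion
        name: str

    CORE = [
        Criterion("04", "Usability"),
        Criterion("06", "e-Learning Strategy"),
        Criterion("06d", "Distance Learning Strategy"),
        Criterion("07", "Decisions on Projects"),
        # ... remaining members of the 18-criterion core set elided ...
    ]

    MAX_SUPPLEMENTARY = 6  # institutions new to Pick&Mix may add up to 6

    def full_criterion_set(supplementary: list[Criterion]) -> list[Criterion]:
        """The fixed core set plus an institution's chosen supplementary criteria."""
        if len(supplementary) > MAX_SUPPLEMENTARY:
            raise ValueError("at most 6 supplementary criteria may be selected")
        return CORE + list(supplementary)

    def validate_scores(scores: dict[str, int]) -> None:
        """Check a scoring-meeting result: one level per criterion code,
        assumed here to lie on a 1-5 scale."""
        for code, level in scores.items():
            if not 1 <= level <= 5:
                raise ValueError(f"criterion {code}: level {level} out of range")

For example, validate_scores({"04": 4, "06d": 5}) passes, while a level of 0 or 6 raises an error.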


Related work and other issues

Two other institutions

With only four institutions benchmarked, there were a number of situations where there was insufficient data from which to draw conclusions. Conveniently, two other commercial benchmarking activities were going on alongside the work, with Liverpool John Moores University in the UK and Gotland University in Sweden. Both managed their benchmarking activities successfully, LJMU using the classic Pick&Mix engagement approach, while Gotland piloted what is now called the lock-in mode. In its pure form this is where the benchmarker calls together the senior management team and their key advisors over a 2-day period, and no one leaves until the job is done. In reality it is not quite so easy, but Gotland is a small institution and the Rector approved this approach, which worked well despite some people having to pop out for other meetings, as expected. A variant of this, the "benchmarker in residence", was used at the University of Northampton: this is where the benchmarker embeds themselves for a week or so in the institution and ensures that all key people see him or her. Both LJMU and Gotland have produced excellent private reports, with the Gotland one, of course, in Swedish.

The BELEUSA project

A bonus, or a distraction depending on one's point of view, came when I was approached in early 2011 by a professor from the University of Wisconsin Extension to bring Pick&Mix into the US context. Cutting a long story short, an EU-US consortium developed a bid, BELEUSA, to integrate Pick&Mix with the US Quality Matters scheme into a new synthesis. Sadly, the US federal government withdrew the US end of the EU-US funding and the bid could not be submitted, but some good relationships were made, and for a future EU bid the EU universities in the Club (and perhaps those associated with the Club) would be natural partners, building on an existing bid relationship with the EU members of the BELEUSA team. One non-EU institution could even be the Third Country partner.

Quality issues

An invitational workshop at Athabasca University provided the financial wherewithal to make a visit to TRU as a side-trip, and also put the benchmarking into a wider quality agenda.

Changes at the University of Leicester

As the work of the Club drew to a close, both of the senior staff at the University of Leicester moved on to other jobs, Professor Salmon to a new senior role at the University of Southern Queensland. It was agreed in consultation with the project manager at Leicester that the University of Leicester would not be able to supply their report to the other members of the Club, since the direction that the University would now take in distance learning would not necessarily be in line with the results of the benchmarking report. In due course I shall review with the incoming Head of BDRA, Professor Grainne Conole (of the Open University), what role the University might like to play in benchmarking activities in future; conveniently, she and I are senior collaborators in the POERUP project funded by the Lifelong Learning Programme.

Final meeting of the Club

We used the opportunity of an invitation to me from KTH to visit Sweden to organise a final meeting of the remaining active members of the Club, at KTH in Stockholm on 21 September 2011. Lund University and Thompson Rivers University attended.

The University of Gotland were also invited, since it was felt that, though originally outside the Club, their experience had been similar and would be of value, especially since this gave all three Swedish institutions involved in Pick&Mix a good base on which to build future collaboration in and beyond Sweden. It was decided at the meeting that the three Swedish institutions and Thompson Rivers University would share with each other reasonably detailed versions of their internal reports, including scores, and that a public document (longer than this summary) would be produced by myself with anonymised scores (set in the context of anonymised scores from other users of Pick&Mix) and a thorough bibliography. Future papers and workshops are also under consideration.

Conclusions on methodology

Three of the four original Club institutions had no substantial issues with Pick&Mix. Lund found it more challenging to use the Pick&Mix criteria without modification. However, two other Swedish institutions, KTH (also a research-led institution but smaller than Lund) and Gotland (a small university college with lots of distance learning), managed to cope very well with Pick&Mix. My conclusion is that, as in the early days of the benchmarking exercise, the style of institution can have a large impact on acceptance. Lund is a very large, decentralised and student-centred institution, and UK experience suggests that such institutions do find the Pick&Mix style of benchmarking more challenging and less appropriate. It may also be a factor that the lead benchmarker at Lund, a good academic colleague of mine, is doing her PhD on benchmarking, and this is likely to generate more comments on methodological aspects.

Thus I feel no need to make any substantial changes to Pick&Mix. However, some changes in emphasis may be required: the fact that the student-related criteria are found near the end of the table can give the (wrong) impression that they are less important. Universities in the Club and others using Pick&Mix can rest assured that in using Pick&Mix they are using a living methodology which (outside the US) has been used in more institutions in the world than any other benchmarking methodology. The University of Northampton is currently using Pick&Mix, and Swedish institutions with my help are formulating a bid for a Baltic Benchmarking Club, based on the model of the Distance Learning Benchmarking Club, with an expertise node in several of the Scando-Baltic states. The variant moods of Pick&Mix (for IT, OER, employability, etc.) are also an active area of development.

Benchmarking outputs

Three key conclusions from the benchmarking exercise are that:

1. only TRU has any strong expertise in market research for e-learning, not only in the Club but out of all the institutions I have benchmarked;
2. no institution has strong competence in costing e-learning;
3. staff reward and recognition for e-learning is not adequate anywhere.

Further details will be in the full report.

Resources

In the full report there will be a fuller bibliography, based on an update of that at http://www.virtualcampuses.eu/index.php/Bibliography_of_benchmarking, but the work in Mendeley is beyond the scope of the project and is not likely to be complete until late in 2011.


See the work in progress at http://www.mendeley.com/groups/1075191/benchmarking-elearning/.

KTH (2011), Benchmarking e-lärande vid KTH, with substantial sections in English: http://www.kth.se/polopoly_fs/1.82335!benchmarkinkbilagorfinal.pdf

Lund (2010), Benchmarking av e-lärande vid Lunds universitet: projekterfarenheter och kritiska framgångsområden för kvalitetsarbete, again with sections in English: http://www.ced.lu.se/about-us/publications/ossiannilsson-landgren

The only public description of the Club so far is in Paul's invited presentation to the ENQA workshop (Sigtuna, Sweden, October 2009) on quality in e-learning; see www.enqa.eu/files/ENQA%20PaulBacsich%20final.pdf

For the latest beta version of Pick&Mix see http://www.maticmedia.co.uk/benchmarking/PnM-latest-beta.xls (note that the first 20 criteria are the so-called core criteria).

For more general reading on Pick&Mix see http://elearning.heacademy.ac.uk/wiki/index.php/Pick%26Mix

For more general reading on benchmarking see http://www.maticmedia.co.uk/benchmarking.htm and http://elearning.heacademy.ac.uk/wiki/index.php/Methodologies_Index

For a bibliography of benchmarking see http://www.virtualcampuses.eu/index.php/Bibliography_of_benchmarking. This is currently being updated via Mendeley; see the groups http://www.mendeley.com/groups/1075191/benchmarking-e-learning/ and http://www.mendeley.com/groups/1087051/quality-of-e-learning/.

For more on the Re.ViCa project see http://revica.europace.org and also see its successor, the VISCED project, which resides on the same wiki: http://www.virtualcampuses.eu

Appendices to the final summary report

These include the ACODE concordance and the other concordances done.

