Saturday 5 July 2014

Three views: Enhancement and evaluation of TEL

#OCTEL - WEEK 6 Webinar


The Week 6 ocTEL webinar offers three views on enhancing and evaluating technology in learning. Viewers with a particular interest could perhaps watch just the third of the webinar that most relates to them or their work.

View 1: iPads in Science  


The first presenter, Mark Kerrigan (Anglia Ruskin University), focuses on an iPads in Science initiative, which is described in: Mobile learning: How mobile technologies can enhance the learning experience (pp.30-38).

In brief, students were loaned iPads, could customise the devices to suit themselves (i.e. treat them as their own, within the terms of use), and could take them into laboratory sessions in specially prepared and tested protective covers. One aim was to support employability skills aligned to best practice, and the learning design included pre-, in- and post-laboratory activities.

View 2: Video enhanced learning  


The second presenter (at about the 15' mark), James McDowell (University of Huddersfield), focuses on his award-winning work for the Most Effective Use of Video, regarding the enhancement of student learning through his use of video feedback. James' profile page (click on the hyperlinked name above) shows a list of publications and/or presentations related to his work (including his work with others) on video enhanced learning. (Unfortunately, this presentation would have been better experienced live, as the short videos that James shows via Blackboard Collaborate do not appear in the recorded webinar accessed via YouTube.)

The presentation linked from the ocTEL week 6 site, titled Using video feedback in formative assessments, seems to be a precursor to the presentation James uses for the ocTEL webinar. Specifically, in the webinar James seems to have reworked the slides he shows (e.g. slide 6 in the link) to focus on:

  • VEL: video enhanced learning
  • VEF: video enhanced feedback 
  • VEA: video enhanced assessment.

He gives the following examples (heavily summarised):
  • VEL: production of a video repository
  • VEF: developing a video feedback loop system to promote engagement with feedback via video
  • VEA: learner generated video vignettes to enable reflection and self-assessment via portfolio.
Here is a conference paper I have found on James' work with video:

McDowell, J. (2011). Using asynchronous video to promote learner engagement through the enhancement of assessment and feedback. Proceedings of the 6th International Blended Learning Conference, Hertfordshire, June 15-16, 2011.



View 3: 3E subject design (e.g. in LMS/VLE)


The third and final presenter (at about the 30' mark), Keith Smyth (Edinburgh Napier University), focuses on the 3E Framework for technology-enhanced learning, teaching and assessment, now adopted and adapted by over 25 institutions.

Keith draws attention to a comprehensive website on this framework, which also contains an excellent and also comprehensive document:

Smyth, K., Bruce, S., Fotheringham, J. and Mainka, C. (2011). Benchmark for the use of technology in modules; [based on the] 3E Framework designed and developed by Keith Smyth. Edinburgh Napier University, Edinburgh, Scotland.



This document is a great resource, built on the principle of not just using technology at the minimum 'Enhance' level, but, where desired and appropriate, further developing learning at the 'Extend' and 'Empower' levels. See the image below, taken from this document, which gives Smyth's definitions of the three E's.


In closing this post


It is a couple of weeks since I viewed this webinar, and the ocTEL course has finished, so I will leave this post as an overview of the three examples of TEL, embedded with useful related links, and let the webinar on YouTube speak to the evaluation side (which is attended to only briefly, given three presenters within one hour, but is still useful).

All three guest presenters offered valuable TEL examples, and I look forward to reading further on their respective works.

Sunday 29 June 2014

3rd and final peacock time: my #ocTEL badges for Weeks 4-6

One week after I 'officially' completed ocTEL, I still have a couple of final blog posts to make. As each participant could decide their own outcomes for ocTEL 2014, my minimum requirement quickly established itself as:

  • complete the course from week zero through six, including at least:
    • the 'If I only do one thing...' weekly activity and post a blog response
    • watch the recorded weekly webinar (plus, additional to the ocTEL guidance, post my blog reflections). 
I did not stray far from this self-imposed minimum, apart from participating in a small number of extra readings/viewings, and a few discussion entries/questions/responses on the forums.

My two final ocTEL blog posts involve showing-off badges from the final weeks (this post), and my reflections on the week 6 Webinar (soon - still in pen-n-paper note format).

Peacock by Jacob Mee is licensed under CC BY-NC-ND

So, without further ado, here are my remaining ocTEL badges:

Final

Week 6



Week 5




Week 4




Friday 20 June 2014

A MOOC focus: How to enhance the effectiveness and efficiency of particular TEL approaches

#ocTEL Week 6 'If you only do one thing...' activity


Education offered on a massive scale is now available, but it is currently more facilitative and option-providing than the disruptive, sector-changing force it was (not long ago) celebrated, or feared, to be.

I watched both of the ocTEL-suggested videos listed below. Both videos are promotional in style and so extol benefits clearly; however, some issues can still be drawn out of the presentations.


The comparison below sets a range of elements of the Saylor and xMOOC approaches alongside those of my academic college (faculty) within a large Melbourne university.

My context: College of Science, Engineering and Health.

Learning mode
  • My context: primarily blended learning, with a substantial reliance on on-campus laboratories (medical, health, pharmaceutical, computer, engineering, experimental).
  • Saylor Foundation: online, asynchronous.
  • xMOOC model (e.g. Udacity): online, asynchronous.

Access
  • My context: primarily exclusive application via the Victorian Tertiary Admissions Centre, reliant on proven school academic ability (ATAR score) or equivalent experience.
  • Saylor Foundation: open; anyone with internet access can learn.
  • xMOOC model: open; Udacity is only partly free now, though cheaper than Stanford.

Learning design
  • My context: we value good learning design, but it is not mandated or routinely evaluated on a broad scale; a small number of courses each year can apply for learning design projects.
  • Saylor Foundation: learning design is critical for all courses ("design [is] the transformative marriage of form and content"). A 'professor consultant' designs a course, and it is peer reviewed by other professors, sometimes from various universities. The design is simple but clearly structured, based on navigating through OER readings and recorded lectures, doing assignments that have answer guides, and sitting a final exam online; all is presented ready for the learner upon commencement. It is intended to equal the quality level of a 15 HE course.
  • xMOOC model: learning design appears in this presentation to be OK in quality, albeit probably fairly rudimentary and consistent. Navigation through an xMOOC^ is "linear and based on the absorption and understanding of fixed competencies"; the "centre of the course is the instructor-guided lesson".

Learning design features
  • My context: extremely varied across the blended modes, from standard, traditional features through to high-end, interactive, contemporary models of engagement.
  • Saylor Foundation: a standardised approach. OER readings and recorded lectures (using (CC) material, or seeking permissions where required), "vetted, assembled, and found as trustworthy by experts". Any gaps are filled by original readings or recorded lectures (presentation media annotated with Paint tools while discussing) created by the professor and released (CC) BY.
  • xMOOC model: xMOOCs typically offer concise, targeted video content; automated testing to check understanding while working through content; discussion forums to "bounce ideas around and discuss learning together" (though it takes a loud noise to get tutor input); and learning treated as something that can be tested and certified.

Educational technology/Learning resources
  • My context: a wide range of institute SOE for educators to choose from, e.g. the LMS (Blackboard 9.1) and all its features, web conferencing (Bb Collaborate), the Google suite, virtual labs and iLabs, and WebLearn (a sophisticated, purpose-built, mathematics-enabling quiz and assessment tool); plus a range of 'non-sanctioned' technologies.
  • Saylor Foundation: LMS is Moodle. Anticipated soon: an ePortfolio to allow tracking of progress and receipt of badges, and a media library as a robust, searchable repository open to the world.
  • xMOOC model (Udacity): a focus on quizzes; solve a problem (e.g. a coding problem, answer a question); if correct, advance, if not, go to video explanations; student discussion/collaboration forums (can post anonymously); a final exam (online). (A toy sketch of this quiz loop follows the comparison.)

Learning support
  • My context: on-campus and online teacher:student and student:student interaction, plus services such as the Study and Learning Centre, physical and online library services, student union support, admin support, etc.
  • Saylor Foundation: unclear; it seems that after design and development, educators are no longer required for a teaching role.
  • xMOOC model: unclear; no teaching role mentioned beyond course creation.

Sharing
  • My context: not really; we put © on nearly everything we produce. From a 'create-hold-on-tight' position, we are moving to using OER, and then hopefully contributing back.
  • Saylor Foundation: yes; not just a consumer of OER, but contributing OER back as well, using a (CC) BY licence.
  • xMOOC model: limited courseware is accessible for free, but there is no reference to sharing back to the OER community.
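To make the Udacity-style navigation more concrete, here is a minimal, purely illustrative Python sketch (not Udacity's actual code; the lessons, answers and input handling are invented) of the 'attempt a problem, advance if correct, otherwise watch the explanation' loop described above:

```python
# Toy model of the linear xMOOC navigation described above.
lessons = [
    {"title": "Variables", "problem": "2 + 2", "answer": "4"},
    {"title": "Loops", "problem": "sum of 1, 2 and 3", "answer": "6"},
]

def attempt(problem: str) -> str:
    # Stand-in for the learner's answer; a real platform would collect this
    # through its own quiz interface rather than the console.
    return input(f"Solve: {problem} > ").strip()

for lesson in lessons:
    # Learner repeats the quiz, with a video explanation after each wrong try.
    while attempt(lesson["problem"]) != lesson["answer"]:
        print(f"Not quite - playing the video explanation for '{lesson['title']}'...")
    print(f"Correct - advancing past '{lesson['title']}'.")

print("All lessons complete: sit the final exam online.")
```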



Problems anticipated with a MOOC-like approach

A problem anticipated with both the Saylor and Udacity approaches is the potential lack of support for students. Both rely on quite 'step-wise' designed learning, and not all students can successfully navigate the steps with the same efficiency or effectiveness. Some will experience hiccups and require guidance and/or support to get back on track.

Additionally, I expect the designs discussed in the videos are not engaging enough to connect with a very wide range of students (which might contribute toward low retention rates in xMOOCs?). While cMOOCs are potentially more engaging (as this ocTEL one is!), each has its drawbacks, as noted in the ocTEL-listed reference: What is a MOOC? What are the different types of MOOC? xMOOCs and cMOOCs

There has been a lot of discussion about MOOCs not being the answer to everything, including, it would seem, from one of the founders of Udacity (see Jason Lodge's blog post, Nov 2013, ^The failure of Udacity: lessons on quality for future MOOCs). It is also instructive to read the comments on this blog post.


Various situations in which to consider efficiencies 

1. Reaching more people or providing a richer experience for the same cost?

  • Reaching out to provide education to a global market.
  • A potential tension: providing education effectively for a range of different learners, with different contextual, cultural and learning expectations. Is there time to adequately conduct a full international LNA (learning needs analysis) and adapt the course, or, better still, to properly internationalise the curriculum and train academics to teach in an international context?

2. Reducing tutor costs by encouraging more elements of the learning experience to be peer-based or self-organised by learners?

  • Computer science students each record a video of their specific learning evidence (programming problem analysis and arrival at potential solutions), via explanation and presentation of coding and other artefacts, etc. Then, within groups of three or four, each student reviews all the other videos in their team against criteria aligned to the course learning outcomes, to collectively contribute to arriving at individual marks; if results vary greatly, the tutor can moderate (a rough sketch of this marking and moderation step follows this list). Each student also needs to include in their review at least one new thing they learnt from each video reviewed. 
  • Educators will need to make sure the students clearly understand the benefits of this intervention so that it is not treated lightly; effectiveness might be hampered if the students do not understand the 'why'. Perhaps a visit, or a recording of a visit, from an industry representative speaking to the benefits of undertaking such learning activities (e.g. regarding the employability skills they might enhance) would help. 
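As an illustration only, here is a minimal Python sketch of how the peer marks and the 'refer to tutor for moderation' flag might be computed. The student names, scores and spread threshold are all invented for the example, not part of any real marking scheme.

```python
from statistics import mean, pstdev

# Hypothetical peer-review scores (out of 100) given to each student's video
# by the other members of their team of three or four.
peer_scores = {
    "student_a": [72, 68, 75],
    "student_b": [55, 82, 60],   # widely spread -> likely needs tutor moderation
    "student_c": [88, 85, 90],
}

SPREAD_THRESHOLD = 10  # assumed cut-off (in marks) for "results vary greatly"

for student, scores in peer_scores.items():
    provisional_mark = mean(scores)                    # simple average of peer marks
    varies_greatly = (max(scores) - min(scores)) > SPREAD_THRESHOLD
    status = "refer to tutor for moderation" if varies_greatly else "accept peer mark"
    print(f"{student}: provisional mark {provisional_mark:.1f} -> {status}")
```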

3. Reducing production and infrastructure costs by using free resources and technology?

  • Using a flipped classroom model where content is provided in the form of OER video lectures or content explanations, made available online for student viewing prior to on-campus tutorial sessions.
  • The conflict in this is that it potentially shifts mediocre, delivery-styled teaching methods from on-campus to online. Very bright students tend to cope despite whatever is dished up, but most students will require structured learning designs to properly engage them with the video content. Simple things, like chunking the videos into smaller bites of content (if the (CC) licence allows) interspersed with meaningful reflection and engagement activities, can better facilitate adequate preparation before on-campus tutorials.

4. Taking the learners’ perspective in getting a sound, rounded, education with minimum financial outlay?

  • Traditional but often effective on-campus SCCs (student consultative committees) could be replicated with web conferencing tools like Bb Collaborate.
  • Definite conflict! I don't know how the welcoming gesture of tea, coffee and biscuits could be replicated through the www! Doh!

Thursday 19 June 2014

Two views: Learners as leaders; Leadership and working in partnership

#OCTEL - WEEK 5 Webinar


The Week 5 ocTEL webinar suffers a little from poor sound quality, but I will report on a range of gems in this post.

Gems from view 1: Learners as leaders


Ellie Russell, Projects Officer for The Student Engagement Partnership, NUS (National Union of Students), spoke about engaging students in curriculum design and TEL projects.

Ellie referred to Trowler, V. (2010), regarding three types of student engagement, which I found on Google Scholar as:

Two resources by Vicki Trowler, one co-authored by Paul Trowler, recommended to be read in conjunction:
Trowler, V. and Trowler, P.  (June 2010). Student Engagement Evidence Summary; Deliverable 2 for the Higher Education Academy Student Engagement. Department of Educational Research, University of Lancaster.
Trowler, V. (Nov 2010). Student engagement literature review. Department of Educational Research, Lancaster University.


Ellie noted three types (or 'foci' or 'axes' in Trowler & Trowler) of engagement:

  • student's own learning focus
  • identity
  • structures and processes
and noted that 'structures and processes' is the least represented in student engagement, including in processes of curriculum design and design changes. Ellie challenges us to focus on receiving a student voice in such structures and processes, and to take up the idea of students as partners. She notes that having this as a policy priority is new, but that students should be able to have some sort of influence on their learning environment.

NUS has published 'A manifesto for partnership'.

Ellie calls for us to reject a 'consumer model', as this devalues educators, and instead offer a new way for students to think about learning (striking a balance, not just putting forward wish lists). 
'Rethinking apprenticeships':
  • attending university to gain mastery, and spending time with experts to do so
  • being wary of excessive engagement, as students may not know what they want (I'm assuming this is a warning against reliance on connectivist-styled learning?).



Argument: "Students can't be equal partners as they don't have the expertise"

Response:
  • We need the input of the student perspective
  • Students are apprentices in the business of engagement, and they need support and mentoring to be effective as partners
  • Goal: preparing students for active, engaged citizenship, not a life of passive consumerism

Ellie supports her presentation with examples of student involvement in curriculum design. A quick Google search shows there are many more, including more on the Higher Education Academy (HEA) site, with several noted in this Guardian article.


Gems from view 2: Leadership and working in partnership


Shân Wareing, PVC for Learning and Teaching, Bucks New University, shares leadership concepts and their application to elearning in higher education.

Shân had lots of tips for project management (PM), and related them nicely back to other ocTEL learning/resources, such as Julie Voce's presentation in the 'If I only do one thing...' activity. Below I will highlight just a few tips.

Collaborative approach:

  • get the 'flow' right
  • ego goes in a drawer
  • listen to others, no right/wrong
  • 1 person working hard does not equate to getting many to work well
  • risk of failure if you don't hear all the voices, e.g. administrators, etc.; don't hear them and it can be a complete ****-up.
PM view of/role in project:
  • see the wood more clearly than the trees, i.e. be able to see individual contributions, efforts and troubles, and note whether any action is required
  • watch for the needs of the individual as well as the collective
  • watch for 'psychopaths', or those in it for themselves rather than for the team.


Referring to the diagram above (a simple re-draw of Shân's diagram), which has the following text:
  • left triangle: 'Hands-on; keeping things moving'
  • right triangle: 'Looking and thinking ahead'.
While leaders need to aim to move on a left to right trajectory, sometimes they have to go left, get their hands dirty, check things, etc. 
When things are:
  • running well, stable, with the right people in post, confident about their roles, then work more on the right
  • unstable, inefficient, negative environment, or in rapid change, then work more on the left.
Further tips shared by Shân include:
  • have project objective and project scope both clearly documented and communicated
  • be clear about deliverables, resources, etc
  • use project methodologies
  • use PM to serve ends (not PM as an end)
  • don't put too much emphasis on the PM system and not enough on the people and project difficulties.
Examples shared by Shân include:
  • Jisc DIAL (Digital Integration into Arts Learning) - a digital literacy project
  • University of the Arts London, My Assessment, an online assessment and marking project (there was a backlash in this project, which was only saved by having the last point below^).

And now I will finish with Shân's shared principles of partnership:

  • respect for people, their experience and knowledge
  • clarity of roles and expectations
  • listening and sharing information
  • timing: combining speed and patience in the right proportions (balance); there is an art to knowing when each is appropriate
  • ^clear sense of purpose.

Leadership, Management and Keeping on Track

Part B of #ocTEL Week 5 If I only do one thing... 


In Part A, I compared some of the experiences of Lisa Carrier and Julie Voce (both of Imperial College London) in their respective educational technology implementation projects; in particular, the strengths, weaknesses and lessons learnt that these two professionals generously shared.

I continue the ocTEL Week 5 'If I only do one thing...' activity by using guidance from Jisc’s Project planning: Project management site to think about a project I have been involved with and consider a range of issues, as noted by sub-headings below. Additionally, and where relevant, I have used Lisa and Julie's (or other) frameworks to aid my answers.

Blended Learning for Updating to Environmentally Sensitive Refrigeration Systems

I am deliberately choosing a smaller scale project to reflect on here, and one which had an external consultant as the Project Manager (I am already verbose; this might help me rein my words in; plus it had some explicit faults I can reflect on...). The project required designing and creating blended learning to support a new learning program. The project was won on tender, and the program was to retrain existing refrigeration engineers and technicians to work with new large-scale refrigeration systems using refrigerants less harmful to the environment. These previously qualified people were expected to have a range of different levels of experience with outgoing systems, but would need to learn about new equipment, techniques and safety issues.

While I could have blogged about bigger, better organised projects, I actually have fond memories of this project, and enjoyed this moment of reflection. Additionally, at the time I achieved limited closure, as (1) the project continued after my involvement, and (2) as I wasn't the PM, I didn't complete reports on the project, etc. 

Who were your stakeholders?

A few of the stakeholders tabled below could arguably shift to different cells. For example, the specific supermarket chain the students came from was kept at arm's length by the specific educational supplier contracted to engage between the chain and the institute, but would feel the impact of poorly trained employees; likewise, the Program Manager, Discipline Head and Head of School were difficult to place as 'primary' or 'secondary'. Referring to the latter example, and using Lisa Carrier's description of primary stakeholders as people who would experience benefits and failures directly, I've probably placed them reasonably correctly in this instance.


Internal, primary:
  • Primary educator
  • Designated Project Manager (PM)
  • Course (subject) educators
  • Me (learning designer)
  • Graphic designer/Web developer
  • Program Manager

Internal, secondary:
  • Discipline Head
  • Head of School
  • My learning and teaching department manager
  • College deputy director of vocational education*
  • Current/future students in discipline area

External, primary:
  • Future students
  • Refrigeration accrediting body
  • Specific educational supplier contracted to engage between the institute and:
  • Specific supermarket chain

External, secondary:
  • Grocery/supermarket industry
  • Potential future workplace employers (transferable skills)

What resources were used?
Staff input related to all the items below (see the primary, internal stakeholder cell above, plus*):

  • Time for analysis of potential learners and learning needs, and curriculum requirements
  • Time to design learning before commencing development
  • Educational technology and tools available in the university SOE:
    • LMS (Blackboard): providing navigation and an overview of the program; the 'Fridgee forum' discussion forum; contact with teachers; linking to course cluster learning in Google Sites; assessment detail and submission; ejournal and blog tools
    • Web and interactive creation tools to create discrete learning clusters of related courses (subjects): Google Sites embedding, with tabs including overview and topic tabs, short videos, and Google Forms to set formative assessment questions
    • Desktop video capture and video editing (Echo360) supported by PowerPoint presentations and documentation for display: various recordings, e.g. for the first cluster, the principal teacher discussing the purpose of the course cluster and training and industry links, and a specialist in vocational education discussing the purpose of ongoing training in, or as preparation for, employment
    • Digital camera and graphics and animation software (Illustrator, etc.): creation of interactivities to familiarise students with the equipment before attending the intensive on-campus days, e.g. animated learning activities controlled by the student, and graphically enhanced images for formative assessments such as labelling in Blackboard quiz features
    • Plus more...
  • Time to develop, review, adjust
  • Time to test functionality (spot users)
  • Time to evaluate learning (peer review only; unfortunately no time for student review pre-implementation; learner evaluation after the first implementation)


How clear/achievable was the project plan?

Mixed. Neither the project objectives nor project scope were explicitly documented and shared with the team; a verbal brief was given by the PM. There was no explicit documentation providing "the project plan, ...[description of] the project management framework, including project organisation, reporting relationships, decision process, and the role of any management committee" (Jisc, 2014). There was an expectation that we all knew what to do; just get on with it.


While the PM did not, or was unable to, provide clear written objectives or overall project scope, he did however provide a list of courses (subjects) required for development (subsequently chunked into two learning clusters), which were then easy to isolate curriculum standards for. The learning designer and primary educator were able to document the learning objectives for the first stage of the project clearly, largely due to the primary educator who knew the industry and his prospective students very well, and clearly understood the curriculum goals, which the learning designer documented - first as a scoping document, then as a full design document.


A potential negative impact of not being privy to the overall scope was that a further set of courses was only identified after project commencement. Not only did this affect team members' vision of the whole project, but this further set of courses related to courses within the original set (while providing another stream of learning), and planning it in conjunction with the first set clearly would have been preferable. The positive outcome was that we already had a model designed and developed to suit the industry and the anticipated learner cohort from the first 'stream' set, and could adapt this model to the subsequent work, hence clawing back some time there.


The project milestone due dates were glaringly macro: simply the dates the first student cohorts would commence study, the on-campus dates, and the completion dates. Therefore, the operational team established finer-grained milestone dates within these macro dates.


What fallback position, if any, did you build into your plan in the event of full or partial project failure?

Dates were already agreed with the client before the team was established, and contracts signed, so an extension of time was not possible. The fallback was to have all learning taught on-campus and support resources provided as a take-away pack if the online materials were not ready. If this fallback position was required, we planned to provide some of the prepared online items to students in a class, support them to interact, and take note of their feedback to incorporate improvements for the next cohort.


OK, so that might have been an option, but not one acceptable to the key team members. This meant that, in actuality, we worked long hours into the evenings as major due dates loomed (not a sustainable solution).


What methods did you use to evaluate your project?

The PM, with the client, administered specifically designed student feedback forms, which were available to both parties for 'post inaugural run' evaluation and were largely brief but positive. Complementing this was a valuable formal client feedback report via a senior, experienced client employee trained in the first student cohort; again largely positive, but with a good amount of detail and some minor points of improvement noted. Anecdotal teacher feedback (including anecdotal student feedback via him) noted small glitches to fix, but was otherwise overwhelmingly positive. The learning designer investigated student engagement and teaching support via learning artefacts in the online learning environment and, apart from varying levels of ability (e.g. surface reflection through to deep reflection), was pleasantly reassured by strong levels of activity across the class coupled with evidence of support provided by the teacher.


How did you measure project success?

Primarily by the student experience (as per previous answer). 

A second, delayed post-training satisfaction survey of employers and employees (former students) could have improved the evaluation, albeit such surveys are not always easy to administer.


Did you celebrate your success and did this encourage further developments?


Additional course clusters were required and added to the project, but my time allowance on the project was up, and a consulting 'instructional designer' was employed to work on the final stage. Therefore, while I feel there was closure as key milestones were achieved, I wasn't around the project office for the end stages. However, a 'thank you' was provided by the PM, in the form of coffee at a nearby cafe for key team members at the point of my departure.


Finally...


While a number of things in this project could have been better communicated, planned and implemented, it was nonetheless an enjoyable project to work on, and key team members (within the school, plus support staff) still use the elearning produced in this project as a good example for others.

"Many aspects of project management come down to good planning and common sense but there are real benefits to learning and using a proper methodology" (Jisc, 2014). While this project goes to show that a small bunch of good people committed to create good things on time is a powerful force, this is not always guaranteed without good planning and supported and timely execution. Even with good people with good intentions, I can imagine how much better it could have been had we also had all the planning principles and practices in place.

The Jisc site offers Wikipedia links to a range of methodologies, such as the Project Management Body of Knowledge (PMBOK), agile software development and flexible product development, plus a link to PRINCE2. Where I have led projects, I have used institute-established project management and reporting tools (quite simple but effective, with a RAG flag component), and trialled Microsoft Project, but I prefer the institute tools combined with tracking documents created in Microsoft Excel, supported by Google Drive for sharing detail, progress and team contributions. I am about to commence a project that will use agile project management and lean product development methodologies, and I'm looking forward to experiencing this. However, I suspect I might be a better organised project leader if I undertook some formal PM training, or had a PM mentor while undertaking a specific project.
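For readers unfamiliar with RAG flags, here is a minimal, illustrative Python sketch of how a Red/Amber/Green status might be derived for tracked milestones. The milestone names, dates and the one-week amber window are invented for the example, not taken from any institute tool.

```python
from datetime import date

# Hypothetical milestone records: (name, due date, finished?)
milestones = [
    ("Design document signed off", date(2014, 3, 14), True),
    ("First course cluster built in LMS", date(2014, 4, 25), False),
    ("Peer review of online materials", date(2014, 5, 9), False),
]

AMBER_WINDOW_DAYS = 7  # assumed: unfinished work due within a week is 'at risk'

def rag_status(due: date, done: bool, today: date) -> str:
    """Return a simple Red/Amber/Green flag for one milestone."""
    if done:
        return "Green"
    if today > due:
        return "Red"    # overdue and not finished
    if (due - today).days <= AMBER_WINDOW_DAYS:
        return "Amber"  # at risk: due very soon
    return "Green"

today = date(2014, 5, 2)
for name, due, done in milestones:
    print(f"{rag_status(due, done, today):5}  {name} (due {due})")
```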


Tuesday 17 June 2014

Leadership, Management and Keeping on Track

Part A of #ocTEL Week 5 If I only do one thing... 


Two people from Imperial College London, Julie Voce and Lisa Carrier, generously shared their educational technology implementation projects with the ocTEL community (via the presentation tool Panopto, utilising Microsoft Silverlight):


It seems that both projects were successful overall (and interesting), albeit each experienced some tricky issues, and both shared lessons learnt from the process.

CS1 involved the comprehensive project of determining which VLE to use (after WebCT was consumed), from an institute-wide perspective, including short-listing four products, piloting two in trial sites, and then selecting and implementing a new VLE.

CS2 established a new blended learning Master's course (an allergy specialisation for health professionals) in an institute that did not previously offer online learning.

Below I have tabled some of the successes they shared, then grouped the issues discussed (including points of failure) and lessons learnt, as there were like themes to compare and contrast across the cases. These are heavily summarised, so I recommend visiting the presentations for follow-up detail.

Some successes

Project objective
  • CS1 (new institutional VLE): achieved and delivered within 1 month of the time plan
  • CS2 (new blended learning program): achieved and delivered on time and on budget

Analysis
  • CS1: thorough evaluation of VLEs (to confirm/question deliverables)

Outcomes
  • CS1: set-up of, and migration of 500+ courses to, the new VLE within 4 months
  • CS2: an innovative template inspiring others; positive recruitment, retention, etc.

Some factors of focus re potential failures/issues and lessons learnt

Process
  • CS1 (new institutional VLE): plan the development of specifications; planning each stage is CRUCIAL
  • CS2 (new blended learning program): have the project objective clearly defined, plus the project scope; planning is everything, so use processes and tools and work out dependencies; complete the design BEFORE implementation

Communication
  • CS1: consult widely in the institute (staff and students) and allow all a voice; consult with vendors and other institutes; some in the institute were unaware the institute VLE review took place
  • CS2: map and consult with stakeholders (external, internal; primary, secondary; power matrix; etc.) and give them all a voice

Time scales
  • CS1: the review at commencement took longer (students, as key stakeholders, were unavailable)
  • CS2: only 3 months from employment to delivery (which meant no time for a pilot)

Project Manager role
  • CS1: need a dedicated PM (not half a role)
  • CS2: need a dedicated PM, and one who begins at the start (not part way through)

Project team
  • CS1: get commitment from committees and key stakeholders
  • CS2: team roles need to be clearly defined; have a project board and team structure; don't assume resources are available (e.g. assuming academics are free to help); map stakeholders and secure buy-in

Resources
  • CS1: the VLE review is a time-consuming and expensive process
  • CS2: the main cost is staff time

Pilot stage
  • CS1: conducted a pilot of two comparison tools in a trial space (not affecting students' study)
  • CS2: no time to carry out a pilot stage

Risks
  • CS1: test systems to ensure vendors' promises are accurate
  • CS2: evaluate risks; have contingencies