Wednesday, 26 March 2014

Evaluating TEL

Part of the UCISA conference is always dedicated to hearing case studies from colleagues at other universities who share their experiences with us. I went to one this afternoon from a colleague, Sarah Horrigan, who worked with us in Sheffield until last year. It was about evaluating institutional Technology Enhanced Learning (TEL) practice at her new institution.

The basic question was: how do you know what good practice looks like? You can ask people, look at your system, or do both. You can run reports from your VLE, but do they tell you whether technology is enhancing learning?

For TEL to really happen, we need digitally literate staff and students. Digital literacy defines those capabilities which fit an individual for living, learning and working in a digital society.
At her new university they had what they called threshold standards for using the VLE (e.g. every course must have ....). Sarah took the threshold standards, combined them with the digital literacy definition, and came up with a rubric for module evaluation. Then a selection of modules was audited. The results were not surprising to those of us who work in HE, and I suspect would be replicated in many institutions.

They discovered lots of tumbleweed moments: lots of modules set up on the VLE with no content at all.
There were also plenty of modules with stuff in the VLE (PowerPoints, the module handbook and so on) but no interaction and no communication.
One threshold standard said the folder structure in the VLE should match the scheme of work, but lots of modules didn't have a scheme of work: standards set up to fail. There was no prioritisation of standards either,
and where a standard was prescriptive (e.g. put the handbook online) there was more compliance, but where the standard was vague, there was more exemplary practice.
They also found gaps in digital literacy, and in particular there was very little skills support.

So, what are the recommendations?
Talk to academic community. Find out what's important to them.
Replace the threshold standards. Look more at a framework of enhancement.
Facilitate via design of the system. Build the boring stuff into the VLE.
Develop digital practice skills. Have a training menu, let staff pick what they want, and deliver it to them, where they are.
Repeat the processes to deepen understanding. Don't assume what people are doing, they probably aren't!

Sarah finished with a nice quote: "Direction is more important than speed. We are so busy looking at speedometers that we forget the milestone." Keep going in a positive direction. That's more important than how fast you get there.

In the same vein of sharing case studies, the last session was a poster one, where twenty or so case studies were shared via posters with the authors on hand to answer questions. There was quite a buzz around the posters this evening, and it gave us another opportunity to talk to suppliers.

- Posted using BlogPress from my iPad

1 comment:

Ros Walker said...

This replicates some of the findings that we had when we looked at VLE use in schools. At the worst level, they were completely empty. Just as bad, they were 'silos' for paperwork, that no one would really want to look at anyway. The best were full of up-to-date relevant materials, which were directly relevant to the needs of the student and provided grounds for engagement with the course. At the end of our project, some of our schools did away with their VLEs altogether (*shock*) and used online tools that were freely available! We did also have some great examples of lively, engaging VLEs which students actually enjoyed visiting and being a part of. Interesting work!