Monday 15 October 2007

Measuring the success of WebPA

Last week I attended a JISC workshop on evaluating JISC funded projects. Although a lot of the evaluation seemed to be steered towards the student experience, it did raise a number of points related to evaluating the project itself. These are especially relevant for those projects where a large proportion of the work is the development of software.

In the past I have sat in no end of meetings where evaluation has been discussed, and the software is quite often overlooked, or in many cases the evaluation of software is simply not understood. I have seen no end of project submissions where the evaluation of the software produced is outlined as "the code will be peer reviewed", but in an open source project that review happens over time and is carried out by the community that develops for the project, rather than by a nominated person or organisation. Getting back to the software, then: the evaluation should be in terms of how widely the system is now used and the attitudes of those using it.

One phrase that came out of the workshop which has stuck with me is "Metrics for Managers". I think this is a good place to start in highlighting the types of information that can show the success of WebPA (after all, it will be a success). For those who are interested in what WebPA can do for the institution, we need to look at student retention rates rather than the softer student experience. By evaluating retention rates we may be able to show that, with the use of WebPA for peer assessment, retention rates increase. However, this may be far in advance of what can be done in the time remaining for the delivery of this project.
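
To make the comparison concrete, here is a minimal sketch in Python of the sort of calculation involved; the cohort figures are invented purely for illustration and are not real WebPA data.

    # Sketch: comparing retention rates for cohorts with and without WebPA.
    # All figures below are hypothetical.

    def retention_rate(enrolled, completed):
        """Percentage of the enrolled cohort that completed the module."""
        return 100.0 * completed / enrolled

    # Hypothetical cohorts: one using WebPA peer assessment, one not.
    with_webpa = retention_rate(enrolled=120, completed=108)     # 90.0%
    without_webpa = retention_rate(enrolled=115, completed=95)   # ~82.6%

    print("With WebPA:    %.1f%%" % with_webpa)
    print("Without WebPA: %.1f%%" % without_webpa)
    print("Difference:    %+.1f percentage points" % (with_webpa - without_webpa))

Of course a real comparison would need matched cohorts and more than one module's worth of data; the arithmetic itself is trivial, the hard part is the data collection.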

There are other measures that we can use for the evaluation of WebPA, like the ones that I have highlighted in the past without even thinking about evaluation. We are already able to collect data on the download rate of WebPA; we just need to be able to marry this back to the number of institutions across the UK, and then by sector, that have adopted WebPA. This is more difficult than it may seem at first. For a start, yes, we can find out the number of downloads the software is receiving (12 to date), but due to the privacy policy of SourceForge I cannot find out any information about who is downloading. This leads on to the second element: how will we know who is adopting WebPA, either at an institutional level or individually in their teaching? The only way of measuring this will be the metrics that we can gather from other areas of the SourceForge system and from contact that we have with institutions and academics. I do think that these metrics will in some cases be difficult to obtain and will also miss out some vital information; however, they will be useful anyway.
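
To give a flavour of what marrying these sources together might look like, here is a minimal sketch that tallies known adopters by sector; the records are hypothetical stand-ins for information gathered from SourceForge and from our contact with institutions and academics.

    # Sketch: counting known WebPA adopters by sector.
    # The records are hypothetical; real entries would come from
    # SourceForge activity and direct contact with institutions.
    from collections import Counter

    known_adopters = [
        {"institution": "Institution A", "sector": "HE"},
        {"institution": "Institution B", "sector": "HE"},
        {"institution": "College C", "sector": "FE"},
    ]

    by_sector = Counter(record["sector"] for record in known_adopters)

    print("Institutions known to have adopted WebPA: %d" % len(known_adopters))
    for sector, count in sorted(by_sector.items()):
        print("  %s: %d" % (sector, count))

The weakness, as noted above, is exactly what goes into that list: the downloads we can count are anonymous, so any list of known adopters will always undercount.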

Even though evaluation is normally left until the end of a project, I think the JISC timing for this event was appropriate. I think it will trigger project teams to start thinking about what they can evaluate and how; in turn this will enable the teams to start collecting data sooner, rather than scraping around for information and trying to make it fit towards the conclusion of the project funding.
