Assessing student contributions to a wiki

The article "Towards a Process for K-12 Students as Content Producers" by John Concilus has some great ideas. I really like the way he refers back to well-founded research from an earlier era (e.g. on process writing and writing for an authentic audience) while discussing the impact of new technology. John's in-depth article raises lots of interesting issues and explores how tools such as a wiki can be used in student learning without lapsing into over-simplistic promotion of the tool.

One innovation he describes that raises some fascinating issues is the WikiDashboard, a tool for analysing individual contributions to a collaborative wiki. WikiDashboard shows a list of the users who have contributed to a page, along with quantitative data about how much each user has contributed and when they did so.

[Screenshot: the WikiDashboard interface]
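To give a sense of the kind of per-user figures a dashboard like this surfaces, here is a minimal sketch that tallies edit counts and net byte changes per user from a wiki page's revision history. This is my own illustration, not WikiDashboard's actual implementation: it assumes a MediaWiki-backed wiki, and the endpoint URL and page title are placeholders you would swap for your own.

```python
# Hypothetical sketch: summarise per-user contributions to one wiki page
# using the standard MediaWiki revisions API. Not WikiDashboard's code.
from collections import defaultdict
import requests

API_URL = "https://en.wikipedia.org/w/api.php"  # placeholder wiki endpoint
PAGE = "Sandbox"                                # placeholder page title

def contribution_summary(page_title):
    """Return {user: [edit_count, net_bytes_added]} for a wiki page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": page_title,
        "rvprop": "user|size|timestamp",
        "rvlimit": "max",
        "rvdir": "newer",   # oldest revision first
        "format": "json",
    }
    data = requests.get(API_URL, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    revisions = page.get("revisions", [])

    stats = defaultdict(lambda: [0, 0])  # user -> [edits, net bytes]
    prev_size = 0  # first revision's full size is credited to its author
    for rev in revisions:
        user = rev.get("user", "unknown")
        stats[user][0] += 1
        stats[user][1] += rev["size"] - prev_size
        prev_size = rev["size"]
    return stats

if __name__ == "__main__":
    stats = contribution_summary(PAGE)
    total_edits = sum(edits for edits, _ in stats.values())
    for user, (edits, net_bytes) in stats.items():
        share = 100 * edits / total_edits if total_edits else 0
        print(f"{user}: {edits} edits ({share:.0f}%), {net_bytes:+d} bytes net")
```

Note that this kind of summary only tells you how much each person edited and when, which is exactly the limitation discussed below.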

While this tool provides fascinating information on who has edited a wiki article, I have some strong reservations about its use as an assessment tool. My main concern is that it makes contributions easier to quantify but offers no insight into their quality. The danger here is related to the assessment dilemma: we tend to assess the things that are easy to measure, but these are often less important than the things that are harder to measure.

If we want to assess educational outcomes such as higher order thinking, analysis and critical thinking, we need to assess qualitative evidence. While the WikiDashboard is a great tool that can help an assessor find qualitative evidence, the data it provides is not in itself such evidence. It can help us find who wrote what content on a collaborative wiki, but we still need to assess each person's contribution qualitatively and avoid the temptation to use its percentages to allocate grades.

2 thoughts on "Assessing student contributions to a wiki"

  1. John Concilus

    Paul,
    Thanks for the mention of my post on the PARC WikiDashboard, and its potential.

    Although I wholeheartedly agree that quantity does NOT equal quality when it comes to evaluating student work, I think that this tool still meets evaluative requirements for two reasons.

    The first is that the tool allows you to quickly visualize the amount and frequency of contributions AND hot-links directly to the “diffs” analysis of that work. In other words, as the teacher you can quickly click on the specific contributions each student made and analyze the quality of that work by “drilling down”. The tool in this case does not analyze, but rather makes it manageable for the teacher to do the evaluation.

    The second reason is not related to the first, but is a conceptual thing. The whole premise of a wiki system for collaborative content development relies on the “Darwinian” improvement of content through others reviewing and collaboratively evaluating what has been written. The PARC WikiDashboard, GIVEN enough peer or other review, shows the impact of any user’s individual content on that page. Therefore, in a “healthy” wiki, or one that has lots of “eyeballs” and/or good eyeballs on the content, the quality work that has survived the editing process should in and of itself be one measure of better work. I realize that this does not replace the teacher’s role, but it should be one indicator of “better” work if the wiki is a healthy one.

    In any case, I do agree with your premise that quantity is not quality, and appreciate the link.

    Regards,

    John

  2. Paul Left (post author)

    Thanks, John – good points! I agree that the WikiDashboard has a lot of potential to help analyse student contributions to a wiki.

    I often help lecturers develop or review assessment processes, and those new to working with web tools don’t always grasp the distinction between quantity and quality at first: hence my post.

    Thanks for sharing your great work with the wider education community…

    Paul
