Further reflections on Cluster B questions

The Cluster C projects (OULDI, ViewPoints from the University of Ulster and PiP from Strathclyde) were invited yesterday to Cluster B's CAMEL meeting to share the tools and resources that we have been developing. Cluster B includes the Predict project from City University, T-SPARC at Birmingham City University, UG-flex at the University of Greenwich, PALET at Cardiff University, and the Course Tools project at the University of Cambridge. The common theme for these projects is flexible curriculum design and approval, and they are working towards a common process map. For more on the meeting itself follow this link to … (I'll add the link when I've got it!)

Prior to the meeting, the cluster briefly reviewed the Cluster C tools and emailed the following focus questions to us:

  1. What philosophy have you used for curriculum design?
  2. Did you have an idea about what tools to develop initially or did you undertake consultation first?
  3. How do your tools encourage collaboration between staff to undertake holistic curriculum design?
  4. Where is the balance between technology and face-to-face tools for your projects?
  5. Do your tools create a pathway for people to follow or are they tools they can use as and when?
  6. Which aspects of the tools did staff respond to most positively?
  7. What would you do differently now?

I have embedded my slideshow below which I think broadly answers the questions but would like to use this blog post as an opportunity to reflect further on questions 3, 4, 5 and 6 specifically because they expose some of the key developments and challenges from the last year.

Increasingly, projects (including our Cluster partners) seem to be reaching findings – as we have been – which suggest that although the tools, representations and resources are useful in themselves, their real value comes when they act as 'mediating artifacts' between people during the design process. In the meeting yesterday Catherine (PiP) mentioned discussions they had been having with a project based at Curtin University of Technology which has developed a similar set of tools and processes to those developed by PiP, is also increasingly finding that there is little benefit in individuals designing in isolation, and has been using the tools to frame and mediate discourse between individuals within and across course teams. ViewPoints' evaluation strategy is focused on exploring in detail the impact of the tools on discussion and design dialogue in the workshops that they have been running with course teams (question 3: How do your tools encourage collaboration between staff to undertake holistic curriculum design?). They have noticed that small changes to the way the tools are presented affect the subsequent discussions – for example, they redesigned their course template to fit on a table because they found that discussion was richer when participants grouped around the table than around a template stuck on the wall. The physical placing and feel of the cards also seems to get people talking – building consensus, sharing ideas and developing knowledge. Groups report that the activities help them build a sense of community and a sense of 'shared output', or shared ownership of the courses. Similarly, the feedback we get from our 'Design Challenge' workshops invariably cites the opportunity to discuss designs with others as a key benefit of the event:

“The thing I enjoyed the most was that we had the chance to work in teams on other people’s modules. I really enjoyed chipping in ideas and suggestions to my team, and helping out with the design of another person’s module. This is something which it seems to me would be good more often?”

“Thank you for running a whole day event. I think we needed this time to really become familiar with the motivation and nature of the course and to develop close cross curricular links with other colleagues, a secondary but immensely valuable side product”.

Two key challenges spring to mind in relation to our project. The first is around how we might begin to evaluate our tools in terms of impact. We can discover quite easily whether people thought the tools were useful by asking them in interviews or surveys, but it is much harder to define a tool's or representation's impact on practice, because the tool or representation in use is inextricably linked to – can only be understood in relation to – a more complex activity system which includes all the things Engeström (1987, p. 78) suggests: subject (student?), object (course?), division of labour, rules and community, as well as the impact of interactions with other activity systems, especially where those are the systems of 'different' people, i.e. students, employers, media services, the library and so on. So although we can say (in answer to question 6: Which aspects of the tools did staff respond to most positively?) that the Course Map, Pedagogy Profile and the ViewPoints assessment cards are most often liked, it is much harder to say what the impact of those tools might be, except by offering contextualised examples of use (e.g. http://cloudworks.ac.uk/cloud/view/3813 and http://cloudworks.ac.uk/cloud/view/4612). The OULDI project is running 8 pilot case studies over 4 institutions across 3 years, so we are confident that we will have plenty of contextualised examples of tools working (and, importantly, not working), but analysing these so that we can make reliable recommendations about use will be a challenge.

The second issue that comes out of our findings about how groups interact around the tools/artefacts relates to how we might replicate this physical interaction in online environments. Clearly we feel Cloudworks has a role to play here, but although there is a very good level of interaction on the site about aspects of a design (a tool, a pedagogical model, a model of assessment), we are not yet seeing many groups collaborating on, or sharing, complete learning designs on the site – this is an identified area of focus for the next 12 months. ViewPoints are similarly challenged with their proposed online tool and have decided that what they actually need is a simple transcription tool.

So the answer to question 4 – Where is the balance between technology and face-to-face tools for your projects? – is that it is kind of the wrong question. It doesn't really matter whether a tool is online or on paper (our online Pedagogy Profile widget works as well in workshops as the paper version, if not better); the key thing is that groups are able to manipulate the tool and discuss together what they are doing and why – and at the moment that is done most effectively in small face-to-face groups around a table.

The final question that interested me was 5: Do your tools create a pathway for people to follow or are they tools they can use as and when? I think our findings about how 'background factors' (team/institutional culture and strategy, skills, attitudes, resources, relationships, etc.) impact on the design process and the resultant designs strengthen our position that there is not, and cannot be, 'a pathway', but instead different routes through to the same place. Our tools and resources have been developed to fit into a single activity, or a repeated/iterated single activity, or they can be put together with other tools and resources to create a bespoke, contextualised and negotiated 'pathway'. This means that the tools and resources can be 'dropped into' existing institutional curriculum design processes at key points, or equally be used as part of more innovative institutional curriculum design processes such as the one developed by the T-SPARC project, where an ePortfolio system is being used to collate supporting evidence of design activity, which then supports appraisal and validation. In this case course teams, with the support of a design mentor, negotiate the appropriate and 'fit-for-purpose' tools and resources they are going to use to mediate, inform and represent their design process, and from these capture outputs for their portfolios.


Engeström, Y. (1987) Learning by expanding: an activity-theoretical approach to developmental research (Helsinki, Orienta-Konsultit).

Engeström, Y. (1999) Innovative learning in work teams: analysing cycles of knowledge creation in practice, in: Y. Engeström et al. (Eds.) Perspectives on Activity Theory (Cambridge, Cambridge University Press), 377-406.
