Case 2: Coeducate Project

Part of the work of the Coeducate project was to support the institution-wide re-validation of the undergraduate curriculum. The process and framework under which this activity was undertaken was devised by a committee with representatives from different parts of the institution. I was included in my role as manager of the Coeducate project, which was already working to improve processes, technical systems, and staff capability for developing work-based curricula [SO1]. The motivation for the curriculum re-validation was driven by the perceived issues set out in the table below, together with the proposed actions to remedy them. As might be expected, this was a contentious initiative; however, that is not the focus of this discussion.

Summary of Undergraduate Curriculum Re-validation Rationale


The re-validation timeframe was challenging, running for one academic year. At the outset, there was a significant dissemination and communication task to explain the rationale behind the changes, the process that was being developed, and the timeframe for the activities. This package of work was the responsibility of the quality assurance unit.

Initially, email was used to distribute documents but, as might be expected, this was far from satisfactory, with numerous version-control issues and missed communications. To this end, I developed an online repository, or portal, to coordinate the re-validation activity, including holding the key documents and schedules of activity [SO3]. The simple approach taken was to create a site in Moodle: ‘This space is open to all University of Bolton staff. It contained the definitive documents and guidance for the re-validation process and was the first place to seek clarification from the team responsible for the implementation of the re-validation activities.’ In addition, a frequently asked questions (FAQ) resource was created, in an attempt both to reduce the workload on senior staff responsible for the validation and to ensure that there was a common message. I believe that this is relevant to SO3, as the coordination of effort around such a large institutional activity, one that required significant staff development, is important.

The above was operationally necessary to ensure the smooth running of the re-validation process. However, from an academic development perspective there was another opportunity: the re-validation process could be used as a vehicle to raise staff curriculum design and development capabilities. The re-validation process as initially envisaged gave little thought to how staff would be supported in the work of writing module and programme specifications. Although not stated explicitly, the assumption was that staff already had the skills to bring a design perspective to developing coherent programmes and modules. This included writing learning outcomes and familiarity with basic concepts such as constructive alignment (Biggs, 2003). My experience working as a chair on validation programmes and teaching on the Postgraduate Certificate in Higher Education had given me first-hand experience of this, and I concluded that there was a need for support in this area [SO1]. The reality was that little attention had been given to developing staff skills as curriculum designers and, as such, there were many myths and uninformed practices at work across the institution.

From the perspective of the Coeducate project, the proposed remedies listed in the table above were used as our starting point for identifying goals for academic development [SO1]. As a project, we came up with the idea of a series of Innovation Support Networks, designed with the aim of empowering colleagues to take advantage of the re-validation to improve their curriculum offerings rather than simply treating it as a tick-box exercise [SO2-3]. The idea, although not fully realised, was that these networks would be sustained by participants beyond the lifetime of the re-validation activities for which they were directly developed. This longer-term view fitted with the aims of the Coeducate project to build institutional capability to develop responsive curricula, rather than taking a quick-fix view of training. The Innovation Networks developed were:

  • Rethinking your Curriculum (Bill, Stephen Powell and Tracy)
  • Module Specifications and Programme Design (Jane and Stephen Powell)
  • Innovating around Employability
  • Innovating around Environmental Sustainability
  • Innovating around Professionals in Practice
  • Innovating around Internationalisation

For the Coeducate project, this was a challenging time, as in many ways the original staff-development aims of the project, which focussed on the development of work-focussed learning, had been overtaken by the institution-wide re-validation activity. On a personal level this was disappointing but, on a positive note, having the opportunity to develop and deliver staff-development activities that addressed a real and immediate need was rewarding. In contrast to the staff-development activity described in case study 1, this work was subject to a formal evaluation process (report). The extract below summarises the evaluation of one of the activities that I led [SO4].

[Figure: session evaluation chart (session evaluation.jpg)]

The evaluation shown in the chart above was traditional in its approach (Hoessler, Godden and Hoessler 2015, p.233) and could be described, in terms of Kirkpatrick’s (1959) four levels of impact, as level 1: a measure of the participants’ reaction, which in this case was a positive one. For the narrow institutional purposes and the political purposes of the professional development unit, this was sufficient. However, a more sophisticated approach that took a developmental lens would have fitted with the philosophy of the Coeducate project: “A theory of change is particular to the context in which it was developed, and will change as the initiative and accompanying evaluation themselves evolve in response to feedback and other program data.” (ibid., p.232). The Coeducate project was based on a theory of change grounded in soft systems methodology (Checkland and Poulter 2006), a type of action research from a systems thinking perspective. This approach recognises the complexity in which change and improvement activities sit, and uses a framework for action and evaluation built on collaboration.

Thinking back, I find it shocking that such a large initiative was undertaken on little more than the opinion of senior managers, with little or no evidence base. At the time, through the Coeducate project, I undertook some basic analysis of the module database to create a baseline against which to evaluate the redesign of the modules’ learning outcomes and assessment. This work highlighted the variety between modules in terms of the number of learning outcomes and the volume of assessment. I believed this was a good starting point for some deeper thought and institutional conversation about the nature of any improvements or changes required. However, the supertanker had already set sail and there was little that could be done to bring evidence to bear as a means of shaping the process so that it addressed institutional, staff, academic and, most importantly, student needs. In my opinion, before embarking on projects of this scale, an evidence base should be collected and a collaborative process developed that includes all of the stakeholders, as there is a significant risk of unintended negative consequences from a one-size-fits-all approach.
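The kind of baseline analysis described above could be sketched along the following lines in Python. This is an illustrative sketch only: the field names ("module_code", "learning_outcomes", "assessments") and the sample data are assumptions for the purpose of the example, not the actual module database schema.

```python
# Illustrative sketch: summarise the spread in the number of learning
# outcomes per module from a CSV export of a module database.
# Field names and data are hypothetical, not the original schema.
import csv
import io
import statistics

SAMPLE_EXPORT = """module_code,learning_outcomes,assessments
ABC1001,4,2
ABC1002,8,3
DEF2010,12,5
DEF2011,5,1
"""

def baseline_summary(csv_text):
    """Return (min, max, mean) of learning-outcome counts per module."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    counts = [int(row["learning_outcomes"]) for row in rows]
    return min(counts), max(counts), statistics.mean(counts)

lo_min, lo_max, lo_mean = baseline_summary(SAMPLE_EXPORT)
print(lo_min, lo_max, lo_mean)  # prints: 4 12 7.25
```

Even a simple summary of this sort makes the variety between modules visible and gives an institutional conversation something concrete to start from.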

Biggs, John. 2003. Teaching for Quality Learning at University. 2nd ed. Buckingham: Open University Press/Society for Research into Higher Education.

Checkland, Peter, and John Poulter. 2006. Learning for Action: A Short Definitive Account of Soft Systems Methodology and its use for Practitioners, Teachers and Students. Chichester, West Sussex: John Wiley & Sons.

Hoessler, Carolyn, Lorraine Godden, and Brian Hoessler. 2015. Widening Our Evaluative Lenses of Formal, Facilitated, and Spontaneous Academic Development. International Journal for Academic Development 20 (3): 224–237.

Kirkpatrick, Donald L. 1959. Techniques for Evaluating Training Programs. Journal of ASTD 11: 1–13.