Draft of paper for ASCILITE 2006 conference, Who's learning? Whose technology?


Authors: Margaret O'Connell, Robyn Benson & Gayani Samarawickrema


The draft of our paper below requires minor changes following blind review.

Abstract: This paper examines the critically reflective approach used by a group of academic support staff in designing, developing and evaluating an e-learning resource for others. The resource, titled Designing Electronic Learning and Teaching Approaches (DELTA), was a showcase of examples of electronic learning and teaching approaches developed at Monash University. The approach included individual and collective reflection, dialogue and action drawing on the features of participatory action research. One component of the evaluation was explicitly designed as a participatory evaluation so that the outcomes of this process could be formally accommodated in data collection. The paper presents this approach as a model that e-learning developers can use to monitor and advance their own professional development, engaging in collaborative dialogue to enhance their professional practice.

Keywords: Teaching and learning strategies, educational paradigms, research methods and approaches, learning communities/collaborative learning, personalised learning

Introduction

As universities adopt new technologies to support teaching and learning, staff development for pedagogically appropriate use of these technologies becomes imperative. Epper and Bates (2001, p. xv) describe staff development and training as a ‘daunting challenge’, and others (Bates, 2005; Kulski, Boase-Jelinek & Pedalina, 2002; Shephard, 2004; Taylor, 2003; Wilson & Stacey, 2004) have repeatedly drawn attention to the need for professional development in this area.

Teaching academics benefit from support to translate their teaching into non-linear, flexible, collaborative, e-supported environments and to gain confidence to use the technology. Educational designers and staff developers can help teaching academics to reconceptualise what they do and to use technology effectively. In recognition of such a need, an exemplars WebCT site titled Designing Electronic Learning and Teaching Approaches (DELTA) was developed by a team of academic support staff involved in educational design and academic professional development at Monash University, Australia. DELTA demonstrates good practices in e-learning by showcasing examples of and ideas for learning and teaching with technology. DELTA was presented within WebCT Vista (the University’s learning management system) to support the time-poor teaching academic, facilitating broader, flexible and ‘on demand’ academic staff development opportunities as part of a strategy to develop a University-wide suite of online and offline support opportunities to complement WebCT training. The principles that guided this approach to staff development included reflective practice for iterative development of strategies for learning and teaching with technology, mentoring in the area of new skills development, learning from demonstrations by colleagues, and cross-faculty sharing and exposure (Benson, Samarawickrema & O’Connell, 2005; Samarawickrema & Benson, 2004).

This paper explores the experiences of the academic support group involved in the design, development and evaluation of DELTA, in the context of participatory action research. It examines this approach as a model for professional development of academic support staff working in areas of innovation and e-learning where formal professional development programs are few. The individual and collective reflections of the group, articulated through dialogue, contributed to the professional development of the group members themselves. While the conversational framework developed by Laurillard (2002) provides one way of conceptualising this experience, the emphasis on the empowering aspects of participant collaboration embedded in the concept of action research (Carr & Kemmis, 1986; Kemmis & McTaggart, 1988) offers a further dimension for exploring the implications of participation for professional development.

Professional development and participatory action research

The use of action research as a model for staff development in higher education is not new (Grundy, 1995; Kember & Gow, 1992). Webb (1996, p.59) noted a decade ago that ‘Apart from phenomenography, action research is perhaps the most influential and almost certainly the fastest-growing orientation towards staff development at the present time.’ Action research is particularly applicable to staff development because it supports critically reflective thinking about one’s own practice, is grounded in the principles of teamwork and collaboration to forge new meanings from experience, and offers a clear framework for acting on these meanings (Brookfield, 1995; Carr & Kemmis, 1986; Kemmis & McTaggart, 1988). Although its roots in the critical theory paradigm (with ideas of change through emancipation and empowerment) may seem far removed from the context of a small e-learning development team, Brookfield (1995), among others, has highlighted the relevance of critical pedagogy to understanding our roles in education. He refers to critical reflection as an ‘illumination of power’ (p.9): it allows us to understand how power frames and distorts our educational processes and interactions, and to question assumptions and practices that are taken for granted as being good for our teaching, while participation provides an avenue for making our thinking public. Other benefits include support for taking informed action, developing a rationale for practice, avoiding self-blame, grounding us emotionally, enlivening our classrooms, and increasing democratic trust. Action learning principles have been acknowledged as important in professional development for e-learning (Ellis & Phelps, 2000), and there has been some recognition of the advantages of action research for professional development in this area (McPherson & Nunes, 2004), but there appears to be scope for wider application of these ideas and practices.

It became clear from the early stages of the design and development of DELTA that the conversations we engaged in as we expressed our design and development priorities, or debated the merits or otherwise of particular examples for inclusion, were exposing our pedagogical values, extending our thinking and creating shared ownership of the decisions made. Hence, in the tradition of Freire (1972), it was evident that we were demonstrating how ‘humans in communication are engaged actively in the making and exchange of meanings, it is not merely about the transmission of messages’ (Evans & Nation, 1989, p.37). We realised that our dialogue was a vehicle for our own professional development as well as offering clear directions for action. Consequently, when planning the evaluation of the site, it seemed obvious that one strategy should involve a participatory process to facilitate the formal collection of our own critical reflections in order to include these, alongside data from other sources, to inform its ongoing development.

Implementing the participatory evaluation process

In extending the concept of participatory action research to participatory evaluation, we were acknowledging the close links between these two forms of enquiry (Greenwood & Levin, 1998; Jackson & Kassam, 1998; Patton, 2002). Participatory evaluation, as a ‘formal, reflective process [people undertake] for their own development and empowerment’ (Patton, 2002, p.183), provided us with a way of documenting our individual and collective perceptions of the site and a means of reaching consensus on the priorities emerging from the evaluation. While participatory evaluation is frequently applied in a community development context, our use of it at a micro level appeared appropriate to the team-based nature of our work, allowing us to move from individual reflection, to identification of areas of consensus through dialogue, and then to prioritisation of the actions to follow. The process we used was as follows:

  1. Collectively identify aspects of the DELTA site for evaluation.
  2. Individually write a 200-word response to each of the (five) aspects identified, summarising each response in one or two sentences.
  3. Compile, circulate and reflect individually on the compiled responses.
  4. Meet in a focus group facilitated by a critical friend to identify consensus items.
  5. List separately the consensus and non-consensus items and prioritise the former for action.

This process exposed the values of individual members of the group in a non-threatening way, allowing for a merging and reconceptualising of shared understandings. It also provided for group ownership of the priorities for action, thus simultaneously supporting both the evaluation and the professional development of the team members. For example, during our focus group discussion, we reached consensus on aspects related to the quality and selection process of examples in DELTA. We agreed that examples should be realistic, achievable, exploit the unique capacities of the technology, establish the learning context, demonstrate good pedagogy, engage learners and address their needs, and be identified by intuitive titles. Consequently, we validated the existing examples as well as confirming the selection process for future examples. On the overall site design, we shared the view that the site needed to support browsing from different user perspectives and to provide improved search capabilities, which led us to refine those design features. There was also consensus that, as a resource for academic professional development, DELTA’s use varied according to user needs (confirmed through other evaluation strategies), and that the process of selecting examples was itself a professional development activity. Considering the evaluation questions collaboratively reinforced our shared accountability for the changes we made to DELTA and the value of learning from each other by developing and refining our individual ideas about e-learning.

Discussion and conclusion

The process described above illustrates how participatory evaluation provided a form of data collection that allowed our own merged understandings to be considered, alongside data gathered from other respondents through other evaluation strategies, in improving the site. It formalised a process which we had recognised as an informal participatory action research cycle during the dialogue that underpinned the site’s design and development. By formalising, documenting and managing the process we not only owned our individual contributions but also consented to the way in which these activities were carried out, thus taking ownership of the process as a whole. From ownership comes empowerment, a powerful motive for change. Brookfield (1995) notes that changes arising from participatory action methods also generate personal or life change for participants. In our case, undertaking such an approach as part of the evaluation meant we could take responsibility for our own quality monitoring of DELTA and its ongoing development as a key outcome, providing an implementation method through which action could follow.

The team-based nature of e-learning development lends itself to participatory action research as a model for professional development, particularly given the volatile state of emerging knowledge in the area, its contextualised nature, and the limited formal professional development opportunities available. The context of developing a professional development resource for others, from the experiences of others, offers the potential of an ever-widening circle of participation, with new understandings emerging through reflection and dialogue. Consequently, in relation to our own professional development, the answer to the question ‘who’s learning about e-learning from whom?’ is, to some extent, that we are learning from each other. However, the processes of participatory action research and participatory evaluation take the learning to another level: the sharing of knowledge and values results in the making of new meaning, so that rather than learning from someone, we are learning together, sharing experiences, drawing from and contributing to an existing knowledge base that benefits the wider e-learning community.

References

Bates, A. W. (2005). Technology, e-learning and distance education (2nd ed.). London: RoutledgeFalmer.
Benson, R., Samarawickrema, G., & O'Connell, M. (2005). Showcasing examples of good practice in e-learning: An opportunity for research in distance education? In T. Evans, P. Smith, & E. Stacey (Eds.), Research in Distance Education 6 (pp. 71-82). Geelong: Deakin University. http://www.deakin.edu.au/education/rads/conferences/publications/ride/2004/doc/8BensonSamarickwemaOConnell.pdf [viewed 20 July 2006].
Brookfield, S. (1995). Becoming a critically reflective teacher. San Francisco: Jossey-Bass.
Carr, W., & Kemmis, S. (1986). Becoming critical: Education, knowledge and action research. Lewes, East Sussex: The Falmer Press.
Ellis, A., & Phelps, R. (2000). Staff development for online delivery: A collaborative, team based action learning model. Australian Journal of Educational Technology, 16(1), 26-44. http://www.ascilite.org.au/ajet/ajet16/ellis.html [viewed 9 July 2006].
Epper, R. M., & Bates, A. W. (Eds.). (2001). Teaching faculty how to use technology: Best practices from leading institutions. Westport, USA: Oryx Press.
Evans, T., & Nation, D. (1989). Dialogue in practice, research and theory in distance education. Open Learning, 4(2), 37-43.
Freire, P. (1972). Pedagogy of the oppressed. Harmondsworth: Penguin.
Greenwood, D., & Levin, M. (1998). Introduction to action research: Social research for social change. Thousand Oaks, CA: Sage.
Grundy, S. (1995). Action research as professional development. Perth: Commonwealth of Australia.
Jackson, E.T., & Kassam, Y. (1998). Knowledge shared: Participatory evaluation in development cooperation. West Hartford, CT: Kumarian Press.
Kember, D., & Gow, L. (1992). Action research as a form of staff development in higher education. Higher Education, 23(3), 297-310.
Kemmis, S., & McTaggart, R. (1988). The action research planner. Geelong: Deakin University.
Kulski, M. M., Boase-Jelinek, D., & Pedalina, V. (2002, 5-6 February). How can we stay in front of the online learning eight ball? Professional development for tomorrow's university teachers. Paper presented at the Teaching and Learning Forum, Perth, Australia.
Laurillard, D. (2002). Rethinking university teaching: A conversational framework for the effective use of learning technologies (2nd ed.). London: Routledge.
McPherson, M., & Nunes, M. B. (2004). Developing innovation in online learning: An action research framework. London: RoutledgeFalmer.
Patton, M.Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.
Samarawickrema, G., & Benson, R. (2004). Helping academic staff to design electronic learning and teaching approaches. British Journal of Educational Technology, 35(5), 659-662.
Shephard, K. (2004). The role of educational developers in the expansion of educational technology. International Journal for Academic Development, 9(1), 67-83.
Taylor, J.A. (2003). Managing staff development for online education: A situated learning model. Journal of Higher Education Policy and Management, 25(1), 75-87.
Webb, G. (1996). Understanding staff development. Buckingham: SRHE & Open University Press.
Wilson, G., & Stacey, E. (2004). Online interaction impacts on learning: Teaching the teachers to teach online. Australasian Journal of Educational Technology, 20(1), 33-48.

Acknowledgement
We acknowledge the other DELTA development team members, Dr Charlotte Brack, Debbi Weaver and Jan Williams, who shared the experiences described in this paper.

Copyright © 2006 Margaret O'Connell, Robyn Benson & Gayani Samarawickrema.
The author(s) assign to ascilite and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author(s) also grant a non-exclusive licence to ascilite to publish this document on the ascilite web site (including any mirror or archival sites that may be developed) and in printed form within the ascilite Conference Proceedings. Any other usage is prohibited without the express permission of the author(s).