Robert L. Dilworth

Action learning is increasingly a prime vehicle for developing teams, "jump starting" organizational learning, promoting leadership development and even transforming an organization's culture. Used in a number of corporations, including their corporate universities, it is also finding its way into institutions of higher learning in the United States and elsewhere. However, there is very little research targeted at how action learning is viewed by learners and how the group dynamics within teams dedicated to action learning unfold. Virginia Commonwealth University (VCU) has given concentrated attention to the evaluation of action learning. In Spring 2000, VCU used a modified version of the Global Team Process Questionnaire (GTPQ) to map group dynamics in action learning sets, the first use of the GTPQ in an academic environment.
Keywords: Action Learning, Evaluation, Higher Education
Principal Methodology: Both Quantitative and Qualitative
For the past five years, the Adult Education and Human Resource Development Master's Degree Program at Virginia Commonwealth University (VCU) has been striving to build evaluation techniques specific to the action learning experience. Action learning has become an important part of curriculum design. Dilworth (2000) reported on action learning programs at six universities, including two outside the United States. At VCU, students in the aforementioned Master's Degree Program are encouraged to think deeply about their experience in dealing with a complex, real-world problem that they are asked to solve as part of the capstone course in the program. The students do this as part of an "action learning set" of four to six members. They are asked to keep "learning journals". At the end of their semester-long experience, "students submit an extensive individual report on the action learning process, group dynamics and personal lessons learned" (p. 529).
Why is action learning different from a typical team-related undertaking? VCU builds its program around core principles of action learning that are found in varying degrees in most other action learning programs.
- The problem to be addressed by the set (or team) is real and in genuine need of resolution. It is not fabricated in any way.
- While it is expected that a solution to the problem can be developed and acted upon, the larger yield is learning itself. The real problem becomes the fulcrum on which critically reflective learning processes occur. The goal from a human resource development standpoint is to develop people who are capable of leading, problem solving, working effectively in teams, and thinking critically in building the long-term strategic capabilities of the organization.
- Action learning must lead to action (Marquardt, 1999). "Merely producing reports and recommendations for someone else to implement results in diminished commitment, effectiveness, and learning. . ." (p. 33)
- Emphasis is on questioning inquiry (the "Q" in the parlance of Reg Revans) as opposed to excessive dependency on "P", standing for programmed instruction (Revans, 1983, p. 11). Revans argues that in a rapidly changing environment we should begin with the "Q" (what is happening, what ought to be happening, and how do you make it happen?).
- The set has no assigned leader and customarily operates as a self-directed work team with responsibilities shared.
- Emphasis can be on moving learners away from what they already know, assigning them to work on problems that no one in the set has any great familiarity with. This can lead to fresh questions (the "Q" factor) and a re-examination of basic underlying assumptions. In this format, members of the action learning set are usually assigned a common problem to deal with. In other approaches, the individual set members may have individual problems they work on that are taken from their respective workplaces. In the latter case, the problem will probably only be familiar to the set member studying that issue (unless an entire natural team is committed to problem solution), thus creating an environment conducive to questioning inquiry.
Problem Statement

There needs to be greater attention given to the evaluation of learning that is taking place, as well as group dynamics, in an action learning experience. This is not an area that is well covered, in part because the focus can center on task accomplishment versus learning. Action learning also has many variations in application, and determining how best to evaluate group dynamics and learning across action learning experiences can be problematic.
Theoretical Construct

Action learning traces its origins to action research and Kurt Lewin (Weisbord, 1987).
Lewin intended his enhanced problem solving model to preserve democratic values, build commitment to act, and motivate learning-all at once. Indeed some people have renamed the process action learning to more accurately indicate its nature. (p. 87)
The work of Reg Revans over the years has given primary shape to action learning, including refinement of the dimensions inherent in the process (Revans, 1983, 1982, 1980). Those dimensions are highlighted in the introduction to this paper.
Action learning is increasingly finding its way into corporations, often as part of their corporate university.
Rather than simply send high potential managers to external executive education programs, these organizations are developing focused large-scale customized action learning programs with measurable results. (Meister, 1998, p. 15)
The Global Team Process Questionnaire (GTPQ) was created by ITAP International, Inc., an organization doing wide-ranging consultancy with corporations worldwide. The fact that the GTPQ is indexed to the global arena makes it doubly attractive as a vehicle for evaluating group dynamics in an action learning set, since sets often have multicultural membership. This is certainly true of corporations today, and it can also be true of the university setting. For example, the Adult Education and Human Resource Development Master's Degree Program of VCU partnered with the University of Salford in England in 1996 in organizing an action learning program for US, Canadian and Australian students (Dilworth, 1996, 1997, 1998, 2000). Further, students in the VCU program in recent years have come from 17 nations.
The GTPQ is a well-established instrument, in wide application since its inception in 1993. It has been used extensively (over 30 administrations with global teams) in the pharmaceutical, chemical, consumer products and information technology industries (Bing, 2000). It has been thoroughly tested in a variety of environments, specifically through peer reviews at the end of the process; results have shown that the team with the best level of process (as indicated by the GTPQ) was also rated as producing the highest quality results. The GTPQ is a diagnostic tool that measures process changes over time on global and distance teams. It has also been used for intact, local teams.
Research Questions

The research questions all stem from one overriding proposition, namely that group dynamics and learning processes in action learning sets can be evaluated. Evaluation of group dynamics in teams is not new. What is new is the attempt to map group dynamics and the effectiveness of the learning process within an action learning experience side by side. The specific research questions are:
- Can a modified version of the GTPQ be used in an academic setting to map group dynamics and effectiveness of learning in an action learning experience?
- What can administration of the GTPQ tell us about the internal dynamics of an action learning set?
- What barriers occur in an action learning experience that can stand in the way of the learning process?
- What positives and negatives do the participants in the action learning experience ascribe to action learning?
Methodology and Research Design
Working in partnership with ITAP International, the researcher modified the GTPQ to fit the academic setting and to obtain information specific to the action learning experience. Most changes to baseline questions were minor (e.g., reference to a class versus a corporate setting).
The following specific questions were asked in the modified questionnaire. Where a slight adjustment has been made to fit the classroom setting, an asterisk appears. Where the question is unique to this particular experiment in evaluating action learning, a pound sign appears. All other questions listed are baseline questions used without modification. Respondents use a six-point Likert scale in assigning a value (favorable to unfavorable). On some questions "6" is high; on others "1" is high.
- Within your team, please characterize the distribution of work among team members over the recent past (equal to unequal).
- Have your skills and capabilities increased through participation in your team?
- Do you have time for work on your team's activities?
- Is the agenda of your team clear? (Clear vs. unclear)
- Are the roles of the team members clear? (Clear vs. unclear)
- How effective is the work of your team? (Effective vs. ineffective)
- (*) Have you had the opportunity to inform others in the class of the work of your team? (No opportunity or need vs. provided a presentation to another group)
- (*) Have you had the opportunity to learn of comments on your work team from others in the class (No vs. quite a bit)
- (*) How do you rank the importance of your team to your own future career success? (Of central importance vs. of little or no importance)
- (*) Is your future career success likely to be positively affected by the team's work? (My future career success will remain unchanged or degraded, to there is likely to be a positive benefit to my future career success)
- Group communications (Excellent to poor)
- Describe the level of trust on this team. (Strong to weak)
- (#) Describe the level of support provided by client(s). (Highly supportive to not supportive)
- (#) The degree of learning occurring in this course experience vs. other courses you have taken. (Much higher to much less)
- (#) The extent to which you find this experience challenging vs. other learning experiences in an academic setting. (Much less challenging to much more challenging)
- (#) How did you find operating in a virtual team environment (i.e., much of the interaction by Internet and telephone) vs. a collocated team at a single site? [One action learning set dealt with a client team over 1,000 miles away.]
- Identify a barrier that stands in the way of your team's work.
a. With respect to your contributions.
b. With respect to internal team productivity.
c. With respect to factors outside the team's control. [These required open-ended responses vs. Likert scaling]
- (#) List four positives and four negatives in priority order of your experience with action learning thus far. [Students provided open-ended entries]
Two action learning sets were involved in this experiment. One set of five consisted of four females and one male. The other set had three female members and one male member. Administration of the Honey-Mumford Learning Style Questionnaire (LSQ) helped determine the set to which a given student would be assigned. An effort was made to mix learning styles and backgrounds in arriving at action learning set composition.
The larger team was involved with a major examination of how professional development programs needed to be designed and promoted for 500 faculty and staff at a large local community college. The other team dealt with a substantial project for the corporate university of a major company based in the Midwest. Their study centered on evaluating how to measure delivery of learning programs. They had on-site visits at the beginning and end of the project (February and April 2000), handling research and interaction with the client team via virtual means in the interval between visits.
Adding to the value of this experiment was the fact that earlier evaluation processes were left in place as the GTPQ was administered. Students kept their learning journals, served as a focus group in discussing their experience as it developed during the semester, submitted a 15 to 20 page end-of-semester essay providing an assessment of the action learning process and group dynamics, and submitted a five to seven page end-of-semester essay on their personal learning. Their personal learning was pegged to critical incident criteria (Dilworth, 1998). The GTPQ was administered twice during the semester. The first administration was done after the teams had been through a month of intensive effort and had a chance to develop some group cohesion. That became the baseline index. Three months later, the second administration occurred as the projects drew to a close. It was therefore possible to compare baseline results with the second administration of the GTPQ, do a gap analysis and map trends. This could in turn be compared with the other evaluative processes used. Following each administration, ITAP International compiled the quantitative results via computer analysis and recorded the qualitative results.
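The gap analysis described above can be sketched as a simple per-question comparison of team means across the two administrations. The question labels and scores below are invented for illustration only; the actual GTPQ items and study data are not reproduced here.

```python
# Sketch of the gap analysis between the baseline and second GTPQ
# administrations. All scores are illustrative, not actual study data.

def team_means(responses):
    """Average each question's scores across set members."""
    return {q: sum(scores) / len(scores) for q, scores in responses.items()}

def gap_analysis(baseline, second):
    """Per-question change from the baseline to the second administration."""
    b, s = team_means(baseline), team_means(second)
    return {q: round(s[q] - b[q], 2) for q in b}

# Hypothetical scores for a four-member set on two questions (1-6 scale).
baseline = {"trust": [4, 5, 4, 4], "communications": [3, 4, 3, 4]}
second = {"trust": [5, 6, 5, 5], "communications": [5, 5, 4, 5]}

for question, gap in gap_analysis(baseline, second).items():
    trend = "improved" if gap > 0 else "slipped" if gap < 0 else "unchanged"
    print(f"{question}: {gap:+.2f} ({trend})")
```

A positive gap maps the kind of surge Team 2 showed; a negative gap maps the slippage seen in Team 1.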
The action learning sets were given a composite/matrix profile of the overall team averages and range for each question asked, together with individual team member scores for each question. Since all completed their questionnaires independently, there was no way of identifying who was responsible for a given score. The results showed relative alignment in some cases (all scores close to the same) and areas where there were significant perceptual disagreements within the set. This created a basis for meaningful discussions within the set in reviewing and "fine tuning" the group dynamics. It served to open up discussion in areas that might otherwise have been undiscussable.
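The composite/matrix profile the sets received can be sketched as a per-question summary of team average, score range, and a flag for perceptual disagreement. The two-point threshold follows the deviation criterion reported later in the findings; the scores themselves are invented for illustration.

```python
# Sketch of the composite profile: per-question team average and range,
# with anonymous individual scores. A range of two or more Likert points
# flags a perceptual disagreement worth discussing within the set.
# All scores here are illustrative.

DIVERGENCE_THRESHOLD = 2

def profile(responses):
    """Build one summary row per question from members' 1-6 scores."""
    rows = []
    for question, scores in responses.items():
        spread = max(scores) - min(scores)
        rows.append({
            "question": question,
            "mean": round(sum(scores) / len(scores), 2),
            "range": spread,
            "divergent": spread >= DIVERGENCE_THRESHOLD,
            "scores": sorted(scores),  # sorted so no score maps to a member
        })
    return rows

# Four members agree on communications; one outlier on time available.
responses = {"communications": [6, 6, 5, 6, 6], "time": [5, 5, 5, 5, 2]}
for row in profile(responses):
    print(row)
```

Sorting the individual scores before display preserves the anonymity the paper notes: divergence is visible, but no score can be traced to a member.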
Each action learning set saw the complete results of the other team as well as their own. Each set then served as a set of "consultants" to the other team in helping them sort through the findings, in determining what they meant and how the team needed to address the findings.
What limitations were there to the research? As with any such investigation, the mix of participants can heavily influence results, no matter how good the methodology or basic learning design (in this case action learning). However, the care in administration of the GTPQ and the forms of triangulation present (e.g., comparing narrative student comments in their essays to GTPQ results) did serve to create a means of interpreting the significance of the results.
Results and Findings
1. Qualitative comparison of GTPQ results with other evaluative reference points (essays) suggests strong congruence, which is reasonable to expect, since both record the same experience.
2. Team 2 started with a relatively low profile in terms of performance based on the GTPQ. It then surged based on results of the second GTPQ. Team 1 had a much stronger initial profile. It then slipped back somewhat based on the second administration of the GTPQ, but retained rather high marks across the board. When the second administration of the GTPQ is compared with the first for each team, it reveals the following trends across the question categories (Likert-based items).
3. GTPQ results for both administrations reveal very strong positive ratings for both teams in the following categories:
a. Work distribution.
b. Use of time.
c. Team agenda clear.
d. Team member roles clear.
e. Team effectiveness.
f. Opportunity to inform others.
g. Opportunity to learn from others.
h. Future career success positively affected.
i. Learning in the academic course vs. others.
j. Experience challenging vs. other classroom experiences.
4. The high quantitative results on the GTPQ are mirrored in the quantitative results generated by the usual faculty evaluation form used at the university to assess quality of instruction and the learning experience. Results were uniformly at a median level of 5 (on a Likert Scale, with 5 as the maximum rating). Student essays also reflect the very positive student evaluation of the experience.
5. When set profiles were examined, they revealed some disparities of view within a set/team. Deviations of two or more Likert points between set members in a given area demonstrated to the set that there were some potential problem points in group process. One example of this was when four of five members of a set rated the quality of communications as excellent, while one member rated this area low. Another area of divergence was related to time: four found adequate time to work on the project and one did not. Such differences are important to know. They provide a basis for intervention strategies within the set itself to alleviate concerns and improve group process.
6. While students placed high value on the learning, they also cited barriers to the team's work, which were invariably time-related. Some verbatim student comments were:
a. Family time is reduced.
b. Time: Balancing work and this project because of the time spent on the project. I feel as though I almost need to be "on leave" from work in order to do extensive research, meet with the client, prepare presentations, etc.
c. Trying to meet at a convenient place and time for all members (not a very big barrier though!)
d. Attendance and punctuality has marginally interfered with productivity, but may have undermined group process (e.g., communication and cohesiveness. . . ).
7. Shown below is a representative sampling of student comments on the most positive and negative aspects of the action learning experience (each ranked Number 1 in the priority order of the four asked for):
Positives:
- Good team cohesiveness
- People from different areas and backgrounds working together
- More camaraderie than traditional courses
- Working with the team and coming from a variety of experience
- Communicating within loop-sharing concepts with all for new ideas
- It's good (to a degree) that we are unfamiliar with the client organization-fresh perspective
- Developing friendships with team members

Negatives:
- Confusion as to which group member should answer questions when asked by the client
- Decision making can be a long process
- Difficult to coordinate logistics
- Must rely on all members, i.e., have to wait when one or more are late
- Overwhelming, easy to lose confidence in project
- Having to coordinate with all even with minor issues
- The client didn't seem to embrace, by way of deep introspection, action learning
- Difficulty getting a real grasp on what is expected of our team (deliverables)
Conclusions and Recommendations
1. To use the GTPQ effectively requires that the team have time to form and coalesce before administration of the instrument. In an earlier pilot test by the researcher, the teams involved did not spend extended time together, and team members therefore did not feel any real vested interest in team performance. Further, there was no single project focus, as there was in this action learning experience. Here, the project work was tightly bounded by time, and how well the group did played an important part in determining the individual student grades.
2. When used in situations such as the action learning experience, where the work tempo is intensive and the success of the team depends on good group dynamics, the GTPQ seems a very powerful tool. It also provides a basis for the team to target specific areas that can interrupt or impede team effectiveness, allowing the team itself to deal with such problems. Since problems have been made evident by the team members themselves, through anonymous completion of the GTPQ, the issues are made authentic and legitimate. It is worth noting that when results are excellent, good performance can end up being further bolstered (reciprocal causation). If an external facilitator, on the other hand, were to identify problems to the group based on observations of group activity, that would not tend to carry as much weight. It would be group process as seen through the eyes of someone who is not a continuous part of that process, rather than the inner conscience of the group.
3. When the GTPQ is slightly modified and wedded to the action learning experience through use of some open-ended questions, it can be doubly useful in an action learning context. The members of the action learning set receive not only quantitative feedback in this case, but qualitative as well. What this suggests is that the GTPQ can be made even more effective and useful by including questions customized to the context involved. This can be especially useful in dealing with cross-cultural situations where an understanding of cultural nuance can be important.
4. To an indeterminate extent, the GTPQ itself served as an effective intervention, in that its administration caused members of the action learning sets to consider a number of areas that are critical to the effectiveness of any group/team (e.g., clarity of team member roles and the level of trust within the team).
5. Use of the GTPQ in this instance, since it was used in tandem with several other evaluative sources (i.e., two essays, professor observation, class as focus group and the usual faculty/course evaluation at the university), provided a means of triangulating the results.
6. In responding to those question areas unique to the classroom version of the GTPQ, students rated the action learning experience more challenging and of higher learning value than other university courses they had taken. In terms of challenge, one set assigned an average value of 5.4. The other set averaged 4.5 (with a Likert Scale rating of six being the maximum in this instance). In terms of learning compared with other courses, one set assigned an average value of 2 and the other 1.75 (the best Likert Scale rating being 1 in this case).
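Because some GTPQ items score "6" as the favorable pole and others "1" (as in the challenge and learning-value items above), comparing averages across items requires normalizing polarity. The sketch below shows one way to do this; the polarity flags are assumptions for illustration, not the actual GTPQ keying.

```python
# Sketch of normalizing mixed-polarity Likert items so every average
# reads "higher is better" on the same 1-6 scale. Which items are
# reverse-keyed is assumed here for illustration.

SCALE_MAX = 6

def normalize(score, high_is_best):
    """Map a raw score onto a scale where 6 is always the favorable pole."""
    return score if high_is_best else SCALE_MAX + 1 - score

# Reported set averages: challenge (6 = most challenging) and learning
# value vs. other courses (1 = most learning). After normalizing, both
# can be compared directly on a high-is-best scale.
challenge = normalize(5.4, high_is_best=True)
learning = normalize(1.75, high_is_best=False)
print(challenge, learning)
```

On this normalized scale, a raw learning-value average of 1.75 (where 1 is best) reads as 5.25, directly comparable to the 5.4 challenge average.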
How this Research Contributes New Knowledge in HRD
1. It shows how a proven survey instrument can be further strengthened through customization and the addition of a qualitative component to go with the quantitative one.
2. The GTPQ seems to have particular utility in an action learning experience because it promotes critical reflection at both individual and group levels, allowing the group itself to determine how best to self-facilitate progress. It can also give an external facilitator a legitimate basis for helping a group work through its self-determined problem areas.
3. As the use of self-directed work teams broadens, with the need to have them truly self-direct their activities, an instrument like the GTPQ can be of great value. As indicated earlier, it can be of particular benefit in identifying areas of possible culture clash when cross-cultural teams are being used. Since some cultures are reluctant to discuss problems openly, this can be a means of getting areas of concern into the open for discussion.
Bing, J. (2000). Explanation and approach to the use of the Global Team Process Questionnaire™. An unpublished manuscript of ITAP International.
Dilworth, R. (2000). Comparing action learning programs at six universities on three continents: Similarities and differences. Proceedings of the Academy of Human Resource Development 2000 Annual Conference.
Dilworth, R. & Willis, V. (1999). Action learning for personal and transformative learning. In L. Yorks, V. Marsick & J. O'Neil (Eds.), Management development and organizational learning through action learning (Advances in Developing Human Resources series). Berrett-Koehler Communications.
Dilworth, R. (1998). Action learning at Virginia Commonwealth University: Blending action, reflection, critical incident methodologies and portfolio assessment. Performance Improvement Quarterly, 11(2).
Dilworth, R. (1997). Orchestration of learning style differences and other variables in an action learning experience. Proceedings of the 1997 Adult Education Conference.
Dilworth, R. (1996). Action learning: Bridging academic and workplace domains. The Journal of Workplace Learning, 8 (6), 45-53.
Marquardt, M. (1999). Action learning in action. Palo Alto, CA: Davies-Black Publishing.
Meister, J. (1998). Corporate Universities. New York: McGraw-Hill, Inc.
Revans, R. (1983). ABC of Action Learning. Middlesex, England: Chartwell-Bratt.
Revans, R. (1982). The origins and growth of action learning. Sweden: Studentlitteratur.
Revans, R. (1980). Action learning: New techniques for management. London: Blond & Briggs.
Weisbord, M. (1987). Productive Workplaces. San Francisco: Jossey-Bass Publishers.
©2000 Robert L. Dilworth, reprinted with permission.
This paper was presented at the February 2001 national conference of the Academy of Human Resource Development.
The late Robert L. (Lex) Dilworth was an Associate Professor Emeritus of Virginia Commonwealth University in Richmond, Virginia, USA. He received his doctorate from Columbia University in Adult and Continuing Education. His specialties included human resource development (HRD), action learning, and organization development (OD). He spent a number of years involved with action learning internationally, including time spent in extensive collaboration with Reg Revans and Albert Barker of England, as well as Verna Willis at Georgia State University. Before he entered his educational career, he was a Regular Army Brigadier General in the U.S. Army. His military assignments included service as The Adjutant General (TAG) of the U.S. Army.