Chapter Four: Workshop Assessment

Overview

Four instruments were used to gather feedback about the workshop and its effectiveness:

  1. a formative, qualitative survey conducted mid-session
  2. a summative, quantitative survey conducted at the end of the workshop
  3. an end-of-workshop group debriefing
  4. a report from each participant, together with evidence from the workshop project course defined in each participant's initial application

Anonymous responses to both surveys were recorded using an online survey tool, Test Pilot, which is effective for this purpose; many similar programs are available for purchase.

Midpoint Survey

The first survey occurred at the end of the first week. During this week, faculty members had primarily played the role of students while using various technology tools. Taking the survey introduced them first-hand to an online testing program, the final tool they saw from the student viewpoint before they returned to playing an instructional role in the second week of the workshop.

In addition to introducing participants to this tool, the survey solicited feedback regarding the content of the workshop. We wanted to gather this information at the midpoint of the workshop so that we could make adjustments as needed and finalize plans for the following week, depending upon the needs of the attendees.

Midpoint Survey Questions

In order to gain feedback regarding the participants' perceptions of the first week, we created a series of eight short-answer questions:

  1. Is the use of face-to-face time in this workshop effective?
  2. Is the use of asynchronous time effective?
  3. What about this workshop is working for you?
  4. What about this workshop is not working for you?
  5. Does the pacing of this workshop enhance your learning experience?
  6. Do you have enough work time?
  7. Have you changed your intended model for your proposed course? If yes, why? If no, why not?
  8. Have any new concerns surfaced during the workshop? If yes, what?

So much of our delivery methodology was new to both us and the participants that we felt we needed to touch base with them about how we were conducting the workshop.

We also wanted to encourage participants to think at this point about their plans for their project courses. Each workshop attendee had submitted a plan to create or modify a course of their own. We hoped to expose them to new possibilities during the week and encourage them to reflect upon their initial vision of how that course would be taught in light of their new knowledge. Question seven of the midpoint survey was therefore intended not only to measure the effectiveness of that portion of the workshop but also to trigger reflection on the part of the learners.

Validity and reliability of the questions were not tested.

Midpoint Survey Response Format

These questions were all included on one online page and respondents were able to go back to review and change responses before submission of the survey.

We made each question in the midpoint survey an open-ended question. At this stage, we wanted participants to be able to give us feedback in a free format but in a way that preserved anonymity. This allowed participants to voice their opinions and concerns honestly, or at least as honestly as can be expected from a small group of people well known to us. Our idea was to implement a sort of critical incident questionnaire, as recommended by Brookfield and Preskill (1999, p. 49).

From these questions, we hoped to gain information that would allow us to modify the plan for the second week if the group's responses indicated problems with pacing, use of time, or workshop format. It also held the potential to point out a topic area that would prompt further discussion or sharing of resources.

Results

Of the 13 workshop participants, 8 completed the mid-session evaluation for a return rate of 61.5%. We had anticipated a higher rate, especially since the learners had a long weekend in which to complete the online survey. However, the feedback we received aided us as we made final preparations for the second week of the workshop.

We had three questions specifically asking for feedback on timing:

  1. Is the use of face-to-face time in this workshop effective?
  2. Is the use of asynchronous time effective?
  3. Do you have enough work time?

Of the eight respondents, seven (87.5%) indicated that the use of face-to-face time in the workshop was effective for them. One respondent (12.5%) indicated that the use of face-to-face time was generally effective but that the pace was too slow and the content was not sufficiently in depth. This respondent also observed that he/she was already familiar with much of the material presented.

We had similar responses regarding the use of asynchronous time.

Of the eight respondents, six (75%) indicated that they thought this time was effectively spent. One (12.5%) thought it "minimally" effective. One (12.5%) thought it "somewhat" effective and wanted to go more in depth with the asynchronous time. This respondent, however, acknowledged that the compressed time frame did not lend itself to a more realistic use of asynchronous discussion.

When we asked about the amount of work time, we again had a similar response. One respondent (12.5%) indicated that there was always a need for more work time; based on that response, we could not tell whether this person wanted more time or felt the amount was sufficient. Another (12.5%) clearly indicated that there was not enough work time. The majority of respondents (62.5%) were satisfied with the amount of work time, and one (12.5%) indicated that too much work time was allowed.

We asked a related question specifically about pacing: "Does the pacing of this workshop enhance your learning experience?" Of the eight respondents, seven (87.5%) responded affirmatively, and one (12.5%) said that the pace was too slow, which was consistent with the responses to the use-of-time questions.

A number of aspects of the workshop worked well for respondents. Grouped together and ranked by the number of times each response was given, these are:

There were fewer items mentioned that were not working for the learners. In fact, three out of eight (37.5%) indicated that there was nothing in the workshop so far that did not work for them. The others (five out of eight, 62.5%) gave these responses:

A number of suggestions for improvement were offered in this section, several by the same learner:

To the question "Have you changed your intended model for your proposed course?" we had an even split: 50% of the respondents said they had and 50% said they had not. Those who did change their models did so mostly (three out of four respondents) because they had become aware of limitations in the technology they had planned to use to implement the model. The specific technologies cited were:

The one remaining learner who changed the intended model did so because of a change in understanding of how online and in-class activities could complement each other in a hybrid course.

The four respondents who did not change their model did not specifically say why they had chosen to stay with their initial choice. However, three of the four did say that they would modify some elements within the larger course structure based on their first week's experiences. This indicated to us that respondents had gained new understanding of the online learning models during the first week and had made an informed decision regarding their choice of instructional model, even if that choice did not change dramatically. This had been a major aim of the workshop: to enable participants to understand a wide range of possible online models and choose the one best suited to their unique situation. That this might be the same model they had initially chosen simply meant that they had made a good first approximation; what mattered was that the decision was based on knowledge and reflection.

Finally, we asked if new concerns had arisen during the first week's sessions. Of the eight respondents, six (75%) had new concerns. Their concerns largely fell into two major, related categories: workload and time management. Two indicated that they were concerned that the administration had not yet addressed workload issues although it was apparently supportive of online classes. The workload issue was of concern because, as four respondents indicated, it appeared that online teaching could be very time-consuming.

One of the six who answered affirmatively indicated that lack of a particular technical skill was a new concern. This learner felt inadequately prepared to use the Web-authoring tool commonly used on our campus: Dreamweaver.

Modifications Made Based on Feedback

Based on the feedback from the mid-session evaluations, we made few changes. The majority of respondents generally indicated satisfaction with how the workshop was being conducted. In particular, the use of time seemed to work for most respondents. Most respondents (87.5%) had indicated that the use of face-to-face time worked for them, and 75% thought the use of asynchronous time was effective. A smaller but still significant number (62.5%) thought the amount of work time was sufficient, with a nearly even split between those who thought there was too much work time and those who thought there was too little. The majority of respondents (87.5%) had responded that the pacing was enhancing their learning experience.

When we looked at the data, we realized that we had one learner who seemed already well versed in the material and for whom the pacing was slow. This is a perennial challenge in any workshop for adult learners: the distribution of skills can be quite broad, and accommodating both ends of the skill spectrum can be difficult. While we wanted to make the workshop a valuable experience for all learners, we decided not to make any significant changes to the pacing of the workshop or the allocation of types of activities. We did begin the following week by stressing that those who wanted more work time could choose to work independently and skip some of the presentations, especially if the content did not apply to their projects. We also reminded participants that work time could be used for one-on-one instruction or small-group work in areas not addressed in the schedule. In this way, we planned to accommodate the needs of all learners as best we could.

Final Workshop Evaluation Survey

The second survey was given at the end of the second week to measure participants' level of satisfaction with the workshop and how well the workshop met the objectives of the administration and the participants. The results of this evaluation were used to gauge the overall effectiveness of the workshop and guide changes for the next iteration.

Final Workshop Evaluation Questions

These questions were a combination of those asked as part of previous intensive workshop assessments and those asked in end-of-workshop evaluations for our shorter hands-on technical skills workshops.

For ease of reading and to help us categorize the responses, we grouped the survey response items into seven sections:

  1. overall satisfaction
  2. workshop as a whole
  3. the instructors
  4. workshop support staff
  5. workshop materials
  6. hands-on exercises
  7. feedback

Overall Satisfaction

The overall satisfaction section consisted of two questions:

  1. What was your overall level of satisfaction with this workshop?
  2. Would you recommend this workshop to others?

These two questions were meant to gain a quick look at general satisfaction with the workshop.

The Workshop As a Whole

Questions regarding the workshop itself were aimed at evaluating the content and its applicability to the learner's needs and goals. In this section, we also sought to measure the relative comfort of the learners with the workshop's methodology and their own confidence in their new skills. These questions were phrased as statements with response options ranging from "strongly disagree" to "strongly agree." More detail about this type of question and response format is given below. The items regarding the workshop were:

  1. had content that was appropriate for my background
  2. had content which was appropriately balanced (general and specific)
  3. met my needs and covered the topics I needed to know
  4. gave me enough time to practice
  5. developed an esprit de corps
  6. was flexible enough to allow me to pursue my goal
  7. had a good blend of instruction vs. independent work time
  8. increased my confidence in my ability to teach an online course
  9. had a good mix of technical and pedagogy instruction

There were a number of questions in this section that were of particular interest to us. Since we knew we were branching into an area of instruction where we would be challenging preconceptions and asking faculty members to consider new instructional models, we particularly wanted to develop an environment where faculty learners were comfortable and able to build an informal learning community for the duration of the workshop, if not longer. We also stepped into a new area of instruction when we began to incorporate instruction on pedagogy as well as technical skills. We wanted, therefore, to make sure we were meeting the needs of the learners and not overstepping our mandate, at least in the eyes of our workshop participants.

The Instructors

In any workshop, we ask about the learners' perceptions of the instructor(s). These questions were drawn largely from our standard workshop evaluation and also followed the statement format:

  1. clearly presented the subject matter
  2. were knowledgeable about the subject
  3. were responsive to the participants
  4. used appropriate instructional methods
  5. paced the workshops well

Workshop Support Staff

Since we had a small number of workshop assistants, we asked questions regarding the perceived usefulness of these staff. Primarily, they were on hand during hands-on technical instruction periods and independent practice time to help when learners had questions. Specifically, we asked for responses to two items about the helpers:

  1. came quickly to my aid
  2. knew how to help me on a given application

Workshop Materials

Another standard set of questions asked at the end of our workshops deals with the instructional materials. In this workshop, we used a wide variety of written materials, all available to the participants on the Web both during and after the workshop. While we solicited feedback during workshop instruction, we also wanted to give the learners a chance, via a couple of survey items, to let us know how they perceived the documents used. As with most other sections, these survey items took the form of statements with responses indicating levels of agreement:

  1. were clearly written
  2. will be useful as reference documents

Hands-On Exercises

Since we offered a number of optional technical skills sessions, we also asked questions regarding the usefulness of the hands-on exercises. Respondents could indicate that this section was not applicable to them by checking "True" to the statement "I did not participate in the hands-on exercises" and advancing to the next section of questions. Those who did participate in any hands-on sessions were asked to respond, in the same degrees-of-agreement format, to these statements regarding the hands-on exercises (a brief sketch of this skip logic follows the list below):

  1. were clear
  2. were at a level of difficulty appropriate for my background
  3. gave me enough practice to be able to work on my own
  4. alerted me to thinking how my students will learn how to use the new technology
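
The screening statement acts as simple branching logic within the survey. The sketch below is only an illustration of that skip pattern; the function and data names are our own invention, not part of Test Pilot or any survey tool's actual interface. Responses from those who skipped the hands-on sessions are recorded as not applicable so that they do not affect the item statistics:

    # A minimal sketch of the skip logic, assuming a T/F screening item.
    # All names here are illustrative, not from any survey tool's API.

    HANDS_ON_ITEMS = [
        "were clear",
        "were at a level of difficulty appropriate for my background",
        "gave me enough practice to be able to work on my own",
        "alerted me to thinking how my students will learn how to use the new technology",
    ]

    def hands_on_responses(did_not_participate, answers=None):
        """Record the hands-on items, or mark the whole section not applicable."""
        if did_not_participate:  # screening statement checked "True"
            return {item: None for item in HANDS_ON_ITEMS}  # section skipped
        return dict(zip(HANDS_ON_ITEMS, answers))

    # A participant who attended responds on the 1-4 agreement scale:
    print(hands_on_responses(False, [4, 3, 3, 4]))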

Whenever we work with a variety of learners, we expect a range of technical understanding and abilities, so we find the responses to these types of questions valuable in gauging the general skill level of the participants and helping us adjust in the future.

The last question was written especially for this workshop. We intended to give our faculty learners a taste of the student experience with these Web-based instructional tools. This was in many cases necessary to give context before we taught our learners how to use the tools for instruction, but we also hoped that it would spark their imagination and allow them to extrapolate to other ways that their students could use technology for learning.

Feedback

Finally, we asked how we did in accepting and responding to feedback during the workshop. We think it is very important to listen to learners and respond to them during a workshop. And while we think we do a good job of this, it is important to check the perceptions of the people on the other side of the podium. So, we asked for responses to two items using the same statement-agreement format:

  1. There was enough opportunity to give feedback.
  2. The feedback was acknowledged and changes were made.

We also included one last open-ended question ("Additional Comments:") to allow for any additional comments that participants wanted to make anonymously. While participants had ample time to provide feedback during the final debriefing period on the last day of the workshop, this medium was the only place to give last-minute feedback without its being tied back to an individual.

Response Format

The survey items were all included on one online page and respondents were able to go back to review and change responses before submission of the survey.

For most of the questions in this survey, we asked for responses to items phrased as statements with which the respondent indicated a level of agreement or disagreement. We asked only one T/F question, regarding the applicability of a section of questions, and the two overall-satisfaction items were phrased as direct questions. Otherwise, we stayed with the statement-agreement format.

For all but the T/F question, we used a unidimensional four-point response scale rather than a five-point Likert scale. We eliminated the middle response, usually marked "undecided" or "neither agree nor disagree," since we wanted the respondents to commit to a choice of response. If they truly did not have a response, they could leave the question blank.
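
Because respondents could leave an item blank, the summary figures reported below, percent agreement and mean of values chosen, were presumably computed over the answered responses only. The following minimal sketch shows one way such figures can be derived; coding the responses as integers 1 through 4 and storing blanks as None are our assumptions for illustration, not a description of Test Pilot's actual data format:

    # A minimal sketch, assuming responses are coded 1 ("strongly disagree")
    # through 4 ("strongly agree") and blanks are stored as None.
    # The coding and names are our illustration, not Test Pilot's format.

    def summarize_item(responses):
        """Return (% agree or strongly agree, % disagree or strongly disagree, mean)."""
        answered = [r for r in responses if r is not None]  # skip blanks
        agree = sum(1 for r in answered if r >= 3)          # "agree" or stronger
        pct_agree = round(100 * agree / len(answered))
        mean = round(sum(answered) / len(answered), 2)
        return pct_agree, 100 - pct_agree, mean

    # Nine respondents, five "strongly agree" and four "agree": 100%
    # agreement and a mean near the values reported in Table 4.2.
    print(summarize_item([4, 4, 4, 4, 4, 3, 3, 3, 3]))  # prints (100, 0, 3.56)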

Results

The survey was returned by 9 of the 13 workshop participants for a return rate of 69.2%. This was just slightly higher than the return rate of the mid-session survey.

The survey results are grouped into sections for ease in reading. The data and some statistical analysis are included in Appendix C. The items are numbered in the order they were presented in the survey.

Overall Satisfaction

Overall satisfaction with the workshop was high. Respondents all indicated that they were "satisfied" or "very satisfied" with the workshop as a whole. Eight of the nine said that they would "probably" or "definitely" recommend this workshop to others; one would "possibly" recommend it. No respondents indicated that they would not recommend the workshop. Therefore, while feedback in the other sections of the evaluation certainly identified items where the workshop could be improved, general satisfaction indicated that we should consider offering it again.

The Workshop as a Whole

The results concerning the workshop as a whole were positive. The survey items were all phrased as positive statements, and agreement was scored on a scale from one ("strongly disagree") to four ("strongly agree"). The mean result for the items in this section ranged from a low of 3.22 to a high of 3.66. Details regarding each item are summarized in Table 4.1:

Table 4.1. Assessment Responses on the Workshop as a Whole
Item % Agree or Strongly Agree % Disagree or Strongly Disagree Mean of Values Chosen
Had content that was appropriate for my background 89% 11% 3.33
Had content which was appropriately balanced (general and specific) 100% 0% 3.33
Met my needs and covered the topics I needed to know 100% 0% 3.33
Gave me enough time to practice 89% 11% 3.22
Developed an esprit de corps 100% 0% 3.66
Was flexible enough to allow me to pursue my goal 100% 0% 3.55
Had a good blend of instruction vs. independent work time 78% 22% 3.22
Increased my confidence in my ability to teach an online course 89% 11% 3.22
Had a good mix of technical and pedagogy instruction 89% 11% 3.33

There were a few points where improvement might be made. We realized from the results of the mid-session evaluation that we had a workshop participant who was already quite comfortable with teaching online and for whom this workshop was not sufficiently challenging. This theme continued to emerge during the final workshop evaluation as well, and is a perpetual challenge when planning workshops since participants often self-select based on interest in the topic.

In addition, we see a slight need to reconsider how time is allocated. One respondent (11%) did not have enough time to practice during the workshop; this was also noted in the free-form comments at the end of the survey. Two respondents (22%) did not think we had a good blend of instruction and independent work time.

One respondent (11%) indicated that the mix of technical and pedagogical instruction was not optimal, although, due to the nature of the response format, we do not know which should have been more heavily emphasized during the workshop. In future offerings of this workshop, we should split this question into two, one each regarding the sufficiency of technical and of pedagogical instruction.

Finally, 11% (one respondent) did not feel more confident in his/her ability to teach an online course.

The Instructors

Workshop participants' perception of the workshop instructors was uniformly favorable. To all questions in this section, all respondents answered either "agree" or "strongly agree." Mean values of responses for each question ranged from 3.44 to 3.77 (see Table 4.2).

Table 4.2. Assessment Responses on The Instructors
Item % Agree or Strongly Agree % Disagree or Strongly Disagree Mean of Values Chosen
Clearly presented the subject matter 100% 0% 3.55
Were knowledgeable about the subject 100% 0% 3.55
Were responsive to the participants 100% 0% 3.55
Used appropriate instructional methods 100% 0% 3.77
Paced the workshops well 100% 0% 3.44

Workshop Support Staff

Attendees' perception of the workshop support staff was also entirely favorable. To both questions in this section, all respondents either answered "agree" or "strongly agree." The means of the values chosen for each question ranged from 3.5 to 3.6 (see Table 4.3).

Table 4.3. Assessment Responses on the Workshop Support Staff
Item % Agree or Strongly Agree % Disagree or Strongly Disagree Mean of Values Chosen
Came quickly to my aid 100% 0% 3.5
Knew how to help me on a given application 100% 0% 3.6

Workshop Materials

The responses to both questions about workshop materials were also positive. To both questions in this section, all respondents answered either "agree" or "strongly agree." The mean value of responses was 3.5 for both questions (see Table 4.4).

Table 4.4. Assessment Responses on the Workshop Materials
Item % Agree or Strongly Agree % Disagree or Strongly Disagree Mean of Values Chosen
Were clearly written 100% 0% 3.5
Will be useful as reference documents 100% 0% 3.5

Hands-On Exercises

Reaction to the hands-on exercises was generally favorable. The mean result for each of the items in this section ranged from a low of 3.1 to a high of 3.6. One respondent indicated a need to improve the hands-on exercises to give more practice, which reinforces the responses calling for more practice in other sections (see Table 4.5).

Table 4.5. Assessment Responses on the Hands-on Exercises
Item % Agree or Strongly Agree % Disagree or Strongly Disagree Mean of Values Chosen
Were clear 100% 0% 3.6
Were at a level of difficulty appropriate for my background 100% 0% 3.5
Gave me enough practice to be able to work on my own 89% 11% 3.1
Alerted me to thinking how my students will learn how to use the new technology 100% 0% 3.4

Feedback

Workshop participants seemed pleased with the opportunities to give feedback. To both items in this section, all respondents either answered "agree" or "strongly agree" (see Table 4.6).

Table 4.6. Assessment Responses on Feedback
Item % Agree or Strongly Agree % Disagree or Strongly Disagree Mean of Values Chosen
There was enough opportunity to give feedback 100% 0% 3.5
The feedback was acknowledged and changes were made 100% 0% 3.55

Final Comments

The complete list of comments can be found in Appendix C.

Both the quantitative data and comments indicated that time was an area that possibly needed improvement. One or two participants indicated in their responses above that they did not have enough time to practice, that the ratio of instruction to work time was not optimal, and that hands-on exercises did not give enough opportunity for practice. In addition, one respondent mentioned in the final comments: "I would have liked more time to work."

At the same time, we had some indication that the workshop as a whole was too long: "I think six days would have been adequate. It was a good blend of instruction, small groups, and work time but could have moved at a faster pace and been condensed into a shorter time frame."

Other respondents indicated that specific sections could have been shortened:

Others seemed to indicate that the amount of time was appropriate, describing it as "two weeks ... very well spent."

Looking at the final evaluation as a whole, responses to the workshop were very favorable. There were no areas that suggested significant improvement before the next presentation of the workshop, and only a few where one or two respondents (approximately 11% in all cases) indicated changes could be made.

Workshop Debriefing

On the final afternoon of the workshop, we held a short debriefing session. This followed our intensive technology workshop format and gave participants a chance to give informal feedback in a group setting, celebrate their successes, and do some brief action planning.

This debriefing took place in a room different from the lab used for instruction. In this way, we began participants' transition from the intensive learning situation back to a more usual setting. This was intended to help them shift their focus from being learners back to working independently as faculty members in the company of their colleagues.

We had the Director of Information Technology Systems and Services, who is a member of the faculty, lead the debriefing session rather than one of the instructors. Again, this helped participants make the transition from the mindset of learners to their roles as faculty members. We also intended that this would encourage helpful critique of the workshop rather than a rehash of what went well. We have a very good rapport with most of our faculty members and sometimes wonder whether they would feel less comfortable offering suggestions for improvement if we led the debriefing session.

A short set of questions was asked during the debriefing. Generally, these questions generate a great deal of sharing and discussion with little need for prompting. They are:

  1. What was the best thing you learned during the workshop?
  2. What was your greatest accomplishment?
  3. How was the balance between pedagogy and tool instruction?
  4. What was least useful?
  5. What do we need more of?
  6. What will you do next?
  7. How can you get help after the workshop?

The workshop debriefing information was intended to complement the surveys and reports by telling us what was useful in the workshop and should be kept or expanded in the next offering, as well as what could be changed or eliminated. We hoped to find common themes between the two feedback mechanisms to help validate the information received through either one. We also anticipated hearing new information because of the informal and collaborative nature of the debriefing. Respondents, we expected, would be reminded of their own experiences as others shared theirs and would be encouraged to voice their opinions and consider different viewpoints and needs.

We also wanted the learners to begin to consider how they would continue with their projects and get help where needed. One of our biggest concerns was that, without some planning for the transition from focused and supported work time back to the daily life of a faculty member, daily priorities and pressures would derail attempts to finish the courses.

Results

The full set of responses to the debriefing questions can be found in Appendix D.

Workshop Features to Keep or Expand

As we look over the items in the lists "Best Thing Learned," "Greatest Accomplishment," and "Balance Between Pedagogy and Tool Instruction," we see a nearly equal distribution between topics or skills that were taught as part of a module and those that were learned in a one-on-one or independent work session. The valued topics/skills included in a workshop module were:

The most valued items that were learned outside of the formal presentations were:

That workshop participants found value in both the formal presentations and informal learning reinforces our commitment to keeping independent work time as a significant portion of this and similar intensive workshops. This allows participants to learn items of particular value to themselves and their projects without requiring that we expand the workshop to include these topics.

Two topics mentioned during the debriefing could be expanded upon: eGradebook and WebCT. These tools were briefly mentioned during an overview of course management tools. At the time, they were both new to our campus; both, however, are gaining users and could be considered for inclusion in a future offering.

Finally, we found it encouraging to hear from the group that our experiment of allowing workshop participants to experience the tools of online instruction as learners was worthwhile. Three recorded items indicated that at least some of the workshop participants found the experience valuable:

Since we had not seen that approach mentioned in the literature, we were not sure whether it would work or how it would be received. Because no one indicated through any of our feedback mechanisms that this approach did not work for them, and several found it beneficial, we plan to make use of it in this and other workshops.

Room for Improvement

While feedback in this session was largely favorable, as it was in the two surveys, there were items where participants felt we could improve the workshop. This primarily consisted of items that we could add. Fortunately, all of them were resources that could be gathered and added to the workshop resources section. Hence, they could be mentioned briefly as part of existing presentations and linked from the workshop Web site without adding significant amounts of time to the workshop presentations. These items were:

Also, some workshop participants thought that the number of faculty guest presenters could be reduced. While those presentations were seen to be useful, some learners thought we had too many and that some of their examples were so extensive as to be overwhelming. On the other hand, other participants found the range of examples to be useful. This mix of responses to the presentations of their peers was anticipated and will be discussed further in the Recommendations Section.

Final Reports and Projects

All participants were expected to provide a single-page report describing the impact of the workshop on their teaching to the Director of Information Technology, who shared it with administrators and the course instructors. This report was entirely open-ended and did not need to answer any specific questions.

Each participant was also expected to develop their workshop project into a course with a significant online component, most likely fitting into one of the four categories presented during the workshop:

All of the workshop participants completed the final report. Two of the 13 participants asked not to be included in this study. The remaining 11 participants' reports and projects were then examined to answer a few key questions:

  1. Did the participant complete the workshop project?
  2. If the participant did not complete the project, is there any indication of why?
  3. What type of model does the completed project most closely fit?
  4. What aspects of the workshop continued to be most valuable in completion of the project or in other aspects of the learner's work life?

Completion of Workshop Project

Of the 11 participants whose reports were examined, 10 completed their projects within three months of the workshop's conclusion. The remaining participant did not complete the project because the course was reassigned to another instructor.

Project Models

The distribution of the ten completed projects is shown in Table 4.7. We sorted projects into the four categories presented during the workshop and added a fifth to describe the result seen in four projects: a course taught with the same in-person requirements as a traditional presentation of the course but with computer-based tools mediating in-class or out-of-class learning and course management activities.

Table 4.7. Project Model Distribution
Instructional Model Number of Project Courses
Cohort Individualized Instruction 0
Hybrid 4
Individualized Instruction 2
Learning Community 0
Technology Enhanced Face-to-Face 4

Despite the newness of the model, a major portion (40%) of the completed projects were done as hybrid courses, in which part of the originally scheduled face-to-face meeting time was curtailed and online activities were introduced. A similar portion (40%) were done as technology enhanced face-to-face courses, a model similar to the hybrid but without the reduction in scheduled meeting times. Finally, two courses (20%) were done as online individualized instruction, in which students work independently and need not interact with other students during the course.

Two models were not adopted by any workshop participants: the learning community and the cohort individualized instruction model.

Most Valuable Workshop Aspects

Participants named several workshop aspects as still valuable even after three months had elapsed from the end of the workshop. Each item is listed below with the number of reports in which it appears:

We see trends common in evaluations of our other intensive "camps." Consistently, one of the highest rated aspects of the technology training camps is the amount of scheduled independent work time. While it certainly does add to the length of the workshop overall, the amount of immediate practice time with tech support nearby is highly valued by a majority of the participants.

Another trend we see in the camp training format is the value participants place on interaction with and learning from their peers. A number of the highly rated aspects fit into this category: seeing how other instructors teach online, networking with other participants, the guest speakers, and sharing ideas about teaching and learning. Many faculty members have told us that they do not have much opportunity in their busy schedules to interact with their peers, and these learning sessions are one of the few avenues they have to network outside their own department.

Fortunately, the overview of technology also received high marks in hindsight. Of the reports examined, 55% mentioned this feature in general terms, and 27% wrote that experiencing the tools as part of the instructional media had been useful to them.

Summary of Findings

Overall, the workshop was perceived as valuable to the participants and met not only their needs but the goals of our administration as well. Ratings from the two surveys and the debriefing indicated a high level of satisfaction with the workshop, and of 11 study participants, 10 successfully completed their projects. The one participant who did not complete the intended project did not do so for reasons unrelated to the workshop or the participant's online teaching skills.

Copyright 2005 Barbara Z. Johnson and Bruce D. Reeves