
Getting It Done


Going from “plan” to “practice” had its challenges!  Over a period of four months, we introduced our program to the students, solicited their feedback, collected attendance data, and tweaked the action plan in response to our observations.

Project Timeline

Our program was not implemented at the start of the academic year.  Rather, we first offered this supplemental resource during the last four weeks of the first semester, beginning in early January 2017, when students were preparing for their midterm exams.  The timing undoubtedly helped us “kick off” the program; we initially advertised the sessions as a way to review and study for the semester test in chemistry.

At that time, however, a tutor training course was not yet in place.  The volunteer tutors were instead identified and approved based on their engagement in live sessions and their overall performance in their chemistry coursework.  The complete tutor training course was made available at the beginning of Quarter 3; successful completion of the training was required to continue as a tutor in Quarter 4.

Throughout Quarters 3 and 4, we modified the schedule each week based on attendance data.  Additionally, the tutors met with the teachers weekly to discuss upcoming content and anticipated student needs.

Marketing

Students were made aware of this program through the conventional communication tools of our cyber school:

  • autodialer calls to their homes

  • announcements on the chemistry course webpages

  • emails to their school accounts

  • shout-outs for upcoming sessions during live sessions

The autodialer was used only during the month of January, as students were preparing to finish the semester.  All other forms of notification continued throughout the remainder of the year.

To spark interest, we also prepared special videos for embedding in our course announcements, sharing during live sessions, and emailing to students:

  • Prepared to Introduce the Program

  • Prepared to Stir Interest Halfway Through

Student Participation

We collected attendance data throughout the entire length of the program.  Our raw data can be found here.

To evaluate the effectiveness of our efforts and the suitability of the plan, we reviewed the following specific data:

  • Overall Monthly Attendance

  • Attendance per Session Schedule

  • Repeat Attendance

  • Attendance According to Teacher Assignment

Overall Monthly Attendance

Clearly, our best turnout occurred during the initial introduction of the program, which coincided with the end of Semester 1.  Though we “sold” these sessions as ideal for reviewing for the midterm exam, we are aware that our virtual learning model provides ample flexibility in when assignments may be turned in.  During these three weeks, we suspect that many participating students were also reintroducing themselves to content in units for which they had overdue assignments.

At the beginning of February, after Semester 1 had ended and report cards had been issued, we observed an obvious decline in per-session attendance.  Nonetheless, participation was still relatively strong when we consider the month as a whole.

In March, our tutors largely tended to sit idle.  Overall monthly attendance in March was less than half that of January.

Knowing that a number of schedule-disrupting events were upcoming in April (spring break and the PSSAs), we temporarily suspended the program for that month and resumed on May 2nd.  Although we had expected students to begin their end-of-year preparations, participation was lowest during the month of May.


Attendance per Session Schedule

Our initial plan provided multiple opportunities for students to take advantage of this supplemental remediation.  We reasoned that some students could only participate during the school day, while others might be more inclined or available to attend in the evenings.

In the first few weeks, some students did attend the evening sessions.  Early in February, however, students stopped logging into the room altogether.  We noticed this in real time and dropped the evening session from our schedule for the remainder of February; we hoped that the decrease in “supply” would boost “demand” during the day.

The session-specific attendance data did not provide any worthwhile insight into the “best” time to offer supplemental, voluntary sessions during the school day.  We recognize that student availability depends largely on individual course schedules, which are difficult for us, as teachers, to determine and navigate.  In fact, the tutors helped us better understand when most students would be attending specific required live sessions.

Repeat Attendance

For the most part, students who visited the peer tutor room did not return for a second visit, though many indicated in survey results that they would.  Only about 15% of our participants visited the room twice; far fewer visited often enough for the sessions to have a meaningful effect on their overall academic performance in the course.  The single student who visited twelve times reported a boost in her grade from a “D” to a “B” and was very vocal with other students about how helpful the sessions had been for her.  We had hoped to track this type of achievement data, but the lack of repeat participants made that analysis impossible.

Attendance According to Teacher Assignment

It is notable that each teacher, except Lisa, had a similar number of students participate at some point during the program.  Since all advertising content was shared and applied equally across the entire group of teachers, we cannot explain why her students participated to a greater degree than those assigned to other teachers.  Cindy and Sarah did not collaborate on this project, so, as a group, we have not discussed what other factors may have contributed to this effect.  When we do, it may be useful to compare individual policies and practices related to scheduling office hours and other one-on-one remediation sessions.

Participant Feedback

During the two-week introduction of this program, we shared a survey at the end of each session so that we could quickly and easily adjust the plan if feedback indicated a need to do so.  Between one-third and one-half of participants completed the surveys.  The raw data can be found here.

In addition to verifying our suspicion that Unit 4, Chemical Bonding, is the most challenging content unit in the Semester 1 scope and sequence of the course, the participant survey data was overwhelmingly positive with regard to first impressions:

  • The peer tutor sessions were implemented as communications had suggested they would be.

  • Tutors were prepared, qualified, and knowledgeable.

  • For repeat students, the content they worked with differed from that of their first visit.

  • Nearly 95% of all participants indicated that they would return to peer tutoring for chemistry help in the future!  The remainder believed they would recommend it to a friend.

Tutor Reflection

Of our eight volunteer tutors, four were enrolled in the Honors course and four in the comprehensive course.

Four responded to a brief survey regarding their retrospective thoughts on the program.  The raw data can be found here.

Again, results were overwhelmingly positive.  The tutors felt prepared and supported, and they learned a lot themselves!  They all agreed that the program, as implemented this year, would have been helpful had they themselves been in need.
