DIFFERENTIATED OBSERVATION 2021
LISA KAROSAS
Student Participation & Feedback
Students were required to complete one practice set of their choice each week. The practice set was programmed in the LMS to appear as "due" each Monday morning and "close" the following Monday at 11:59 pm, giving students a full week to complete the assignment. This routine was altered twice in the second semester, when I assigned specific practice sets related to content being studied in class; students were assigned these practice sets for completion on asynchronous schedule days, though they retained access for the usual full week.
The following is a graphical depiction of student participation for all weeks through the end of the third marking period:
On average over the 22 weeks during which Weekly Practice items were assigned, approximately 80% of the class list (~105-110 students) submitted the correct evidence summarizing their effort.
I was aware that students might need some time to become comfortable with this new and very different routine of visiting a third-party website to complete required assignments and use a software tool to collect evidence of their work. Indeed, there were many occasions on which practice snapshots were missing a title. In these cases, I could not discern which content the student had practiced. Other times, snapshots were completed as desired but reflected content that did not originate from the approved list provided in Agora's LMS. In these cases, the student had accessed the practice item from within CK12.org, choosing from a vast list of possibilities. Here, their practice was irrelevant and did not support their ongoing understanding of standards as reflected in our curriculum.
For the first five weeks, it became customary for me to flag such an assignment with a half-credit score, 5/10, and attach a note indicating what was wrong with the submission. Students were always given the opportunity, even encouraged, to secure the correct snapshot and resubmit it to me directly via email. During those first five weeks, nearly every student did fix their errors, as indicated by the lack of green data in the figure above. From week six to week nine, however, a small portion chose not to make these changes and so did not receive full credit.
INCOMPLETE WORK AND THE "PRESENT-NOT-PARTICIPATING" POPULATION
Upon first considering this data at the end of the first semester, reflecting on fifteen practice assignments, I assumed the nearly constant 20% of students not completing Weekly Practice reflected the "present-not-participating" portion of my class list. However, as I collected more data throughout the third and the start of the fourth marking period, I was surprised to find that this was not entirely the case.
Part of my in-class instructional strategy involves the daily completion of investigative or problem-solving-centric classwork. Each day, I collect and review student responses to classroom challenges and keep an accurate record in my own personal notes which I primarily use to record "Participation & Engagement" scores for each student in the gradebook. I also use this information to group students in my course for assessment-review purposes; by assigning the same test to different groups of students, I'm able to generate assessment data for each group. Doing so helps me better understand outcomes and make future plans. Specifically, I create and use three groups throughout the school year:
Present and Active, "PA"
These students complete classwork approximately 3 of 5 days each week.
Present Not Participating, "PNP"
These students might attend class regularly, be chronically absent, or be chronically tardy. Generally, these students complete classwork two or fewer days each week.
Asynchronous, "ASYNCH"
These students work asynchronously, completing daily classwork and submitting it to a dropbox rather than on the whiteboard during live sessions.
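For concreteness, the three-way grouping above amounts to a simple decision rule. The following is a minimal sketch of one way to encode it; the function name, parameter names, and the reading of "approximately 3 of 5 days" as a threshold of at least three days are my illustrative assumptions, not a literal record of my gradebook process.

```python
def classify_student(classwork_days_per_week: float, works_async: bool = False) -> str:
    """Assign a participation group from average classwork days per week (0-5).

    Asynchronous students are grouped separately regardless of day count,
    since they submit daily classwork to a dropbox rather than in live sessions.
    """
    if works_async:
        return "ASYNCH"
    # "Present and Active": completes classwork roughly 3 of 5 days each week.
    if classwork_days_per_week >= 3:
        return "PA"
    # "Present Not Participating": two or fewer days of classwork each week.
    return "PNP"
```

In practice the classification came from my daily notes on classroom challenge responses, so a rule like this is only an approximation of that judgment.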
The table of data and the accompanying graph, below, show the participation-based classification data I collected throughout the 2020-2021 school year through April 25, 2021. Test data can be ignored for the purpose of this discussion.
As most teachers in our virtual school would predict, the percentage of "PNP" students grew from September through April, with the greatest lack of engagement occurring later in the academic year. Collectively, the "PNP" population among students in my chemistry classes grew from 7% in September to a whopping 48% in April.
This data does not suggest that "PNP" students ignored Weekly Practice assignments! Perhaps the original 7% that exhibited no participation in classwork during live sessions also did not complete Weekly Practice assignments. However, as that population grew, the percentage of students completing Weekly Practice remained constant. Specifically, from December through April, 10%-20% of the class list stopped participating in live-session classwork but continued completing Weekly Practice assignments. Since one of my reasons for incorporating this practice was to uplift the mindsets and overall scores of students struggling with the challenging content inherent to the coursework, this data is encouraging. Perhaps, even as students became increasingly uninterested or discouraged by the nature or difficulty of the content, their ability to choose less challenging or more interesting practice items enabled them to earn overall scores that did not prevent them from passing the course.
Such a conclusion might be further supported by graphs in which each student's final calculated grade is plotted against the subtotal of the Weekly Practice score over the grading period examined. Here, individual student participation is indicated by the blue points and final calculated scores for semester 1 and quarter 3, respectively, are indicated by the yellow points.
In Quarter 3, an increase in the number of blue points on the 100% line indicates that more students submitted all of the Weekly Practice items assigned.
One could suggest that individual students made a more focused effort to complete these assignments during Quarter 3 than in past marking periods in order to keep their overall grade from dropping considerably. Incidentally, it is during this marking period that the chemistry content becomes dramatically more mathematical and students become the most discouraged.
SOLICITED STUDENT FEEDBACK
As part of regular professional reflection, I issue a survey to students at the end of each quarter. The survey prompts students to consider all aspects of their experience in the course, ranging from my knowledge of and interaction with them to the course materials provided and routine classroom practices. This year, I issued the same survey over all three marking periods.
When asked about their opinion and perspective on the Weekly Practice requirement, over half (62%) of 202 total student responses collected throughout the year classified it as a "great feature" that "helped ...so much".
Additionally, in the field requiring an open-ended answer to the question, "If there is one aspect of this course you'd be upset to see removed, what would it be?", 26% of the 202 total student responses collected throughout the year wrote in some form of "weekly practice".