Lessons My Students Taught Me

An Honest and Forthright Assessment of the 2022-2023 School Year

Detailed, evidence-based feedback is one of the pillars of our ungrading approach, called the Learning Progression Model (LPM). Because I tried a lot of new things this year, I need to examine whatever evidence I have so I can see what to keep, what to ditch, and what to alter. What lessons did my students teach me?

Time for the Autopsy

Being an analytical sort of person, I will rely heavily on quantitative evidence. This means using numerical scores and talking about grades a lot more than I am comfortable doing. While this creates some cognitive dissonance, it is familiar ground upon which we can, perhaps, evaluate the effectiveness of the LPM approach.

Here’s the plan: first a brief recap of the changes I made this year, then I’ll dig into the results in detail. After that, I can describe possible next steps. 

Brief Background

If you are reading this at all, you probably know at least parts of my journey. Eight years ago, with my colleague Dave, I shifted to a form of standards-based grading. (Throughout this post, when I say “we”, I mean Dave and myself.) This morphed into our own ungraded approach, the Learning Progression Model (LPM). This entails using 10 practices to assess student achievement, with grades tied to target levels. The target levels move from foundational to sophisticated on each of the 10 practices, and as long as students are meeting the published targets, they will earn their desired A. As physics teachers, our 10 practices are tied to the Next Generation Science Standards, specifically the Science and Engineering Practices. We treat the content of the course as the medium that we use to focus on transferable skills such as problem solving, creating explanations, and graph interpretation.

Over the years, we have refined the learning progressions for each practice and have gotten to a point where we are comfortable with the process of ungrading. However, we work in a traditional-grades school, constrained by the student information system that we are forced to use (Genesis). While we would like to get rid of grades completely, we cannot. In addition, we are the only teachers doing this in our school. As a matter of fact, it has been extremely difficult to fight the culture of grades. Given this setting, we put many structures in place to support students (and parents) in making the shift, such as posting policies, sending emails, setting goals, and spiraling the curriculum.

The Changes

Given that we have been doing this for 8 years, you might think that we’ve nailed it down. Nope. We are always reevaluating, tweaking, and refining. This school year, we made several changes, with many happening mid-stride, in order to address specific issues that arose with this year’s students.

  1. More Formal Conferencing. One-on-one and small group discussions have always been part of the LPM, but this year I restructured the timing and frequency of them. There were 3 mandatory meetings: October, February, and May. The first one was just a Q&A, get-to-know-you session. The second was a more formal reflective practice to discuss the results of semester one and set goals for semester two. The last was at the end of the penultimate unit to address end-of-year goals. A fourth, end-of-year conference was required for a few at-risk students but optional for everyone else.
  2. New Grade Translation.  In January, we shifted to a much more intuitive grade translation that better reflected the significance of each unit’s target levels. In addition to being easy to implement and explain, it can be used for all classes. We no longer need different charts to determine the grade for each course. It’s also easy to use at any point in the school year, as opposed to only at semesters’ end.
  3. Grading Contracts. We implemented a grading contract at the May conference. The hope was to engage student agency, ramp up motivation, and provide clarity. More on that later.
  4. Revised Learning Progressions for the Project. I became very dissatisfied with the Engineering Design Process and Engaging with Content learning progressions over the course of the school year. Scoring the projects became more and more of a checklist, rewarding compliance, not creativity. It was also becoming apparent that we just don’t do enough projects for students to get sufficient feedback to improve. Going into the last project, I made several big changes in those two learning progressions.
  5. New Culminating Unit. I took on the challenge of integrating climate standards into my course. This was both a lot of fun and overwhelming. Teaching about solar power is fantastic because it pulls together electric circuits, energy, motion, motors, and forces. This is perfect at the end of the year because it provides opportunities for complex and interdisciplinary responses, allowing students to respond at the Expert level if they are ready. 
  6. Final Exam. I gave my last test during final exam sessions. There were the typical test questions to assess the 4 practices (LP5-LP8). However, I also included a lab question, so that students could have a last attempt at the 4 lab practices (LP1-LP4). This meant that the test took anywhere from 2 to 3 hours. It was unreasonably long for the students, in my opinion. In addition, even though I attempted to assess the concepts introduced in the new unit fairly (see #5 above), I acknowledge that the questions were not all pedagogically fantastic.

I implemented each of those changes thoughtfully, and often with excitement, writing about them in previous blog posts. I LOVE the new grade translation: the ease with which it communicates the requirements, and its universality. The conferencing was wonderful, and I felt more connected to the kids than I ever have before. The grading contract was an attempt to address student passivity and disengagement, and was an interesting new feature. The tweaks to the language of the project’s learning progressions were meant to make the projects more fun, refocusing on the process of engineering design. Each of the 6 items above was fully thought out, if not perfectly implemented.

So did any of this work? How do I measure success?

The Results

While I do not post grades on the SIS frequently, I keep track of each student’s achievement in an Excel spreadsheet. I use this spreadsheet to make decisions about target levels for the subsequent unit, pinpoint students who need support, and keep a permanent record of every student’s performance across the year. I have been doing this since 2019, but I’m not including that first year because the Covid shutdowns happened in March 2020. So, I have the results for the past 3 school years: 2020-21, 2021-22, and 2022-23.

In Excel, I record student results quantitatively: Not Enough Evidence (NEE) = 0, Beginning = 1, Developing = 2, Proficient = 3, Advanced = 4, and Expert = 5. This enables me to analyze the data more easily, looking for trends by averaging and making graphs. I compare student results from unit to unit to make sure that they are progressing; I get a better sense of whether the class is ready to move forward; I can compare one unit in one year to that same unit in a previous year. I do all of these things to keep myself grounded in facts, not feelings.
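Although the real bookkeeping lives in an Excel spreadsheet, the idea is simple enough to sketch in a few lines of Python. Everything below is hypothetical (the students, the scores, the choice of practices); it just illustrates the kind of averaging and unit-to-unit comparison I mean.

```python
# A minimal sketch of the score bookkeeping described above (hypothetical data).
# Levels map to numbers: NEE=0, Beginning=1, Developing=2, Proficient=3,
# Advanced=4, Expert=5.

# scores[student][practice] = one score per unit (Units 1-6)
scores = {
    "Student A": {"LP1": [1, 2, 3, 3, 3, 3], "LP6": [2, 2, 3, 3, 2, 3]},
    "Student B": {"LP1": [1, 1, 2, 2, 2, 3], "LP6": [1, 2, 2, 3, 3, 4]},
}

def class_average(practice: str, unit: int) -> float:
    """Class average for one practice in one unit (unit is 1-indexed)."""
    values = [student[practice][unit - 1] for student in scores.values()]
    return sum(values) / len(values)

# Unit-to-unit trends show whether the class is progressing; the same
# computation can compare a unit against the same unit in a prior year.
for unit in range(1, 7):
    print(f"Unit {unit}: LP1 average = {class_average('LP1', unit):.2f}")
```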

I will discuss the data in several ways:

  • Class progress over the course of this year
  • End-of-year comparisons across the 2020-21, 2021-22, and 2022-23 cohorts
  • Individual student progress across the year

Note that I teach two levels of physics classes: Honors and Regular. While I will primarily focus on regular physics, there are some interesting patterns that occur when comparing the two classes. 

Overall Class Trends

In the graph below, the y-axis lists each of the practices. Each bar represents a unit, with the top bar being Unit 1 and each subsequent bar the next unit. If you look at the first set, LP1 Experimental Design, students earned approximately 1.5 in the first unit, between the Beginning and Developing levels. They improved noticeably in Unit 2 and significantly in Unit 3, then plateaued at about 2.5 from Unit 3 through Unit 6.

Physics Achievement across all 10 Practices 2022-2023

Let’s examine LP6, Problem Solving. In the first unit, students earned almost a 2.5, which means that about half the class was earning Proficient at the start of the year. They continued to improve up through Unit 4, but then regressed almost back to where they started the year.

Compare that to the previous year:

Physics Achievement across all 10 Practices 2021-2022

LP1 shows pretty similar trends (albeit slightly higher at every point) but LP6 shows much bigger improvements, especially at the end of the year. Why was there so much more improvement last year? It might be helpful to look at this more precisely. This table shows the average scores at the end of the school year for the past 3 years, for each learning progression.

Physics Achievement: Year-to-Year comparison

You can see from the data that, in regular Physics, there is a steep decrease in nearly every learning progression year-over-year. Here it is graphically:

Year-to-Year score comparison (Physics)

Eight of the 10 practices show decreases in achievement since last year, and there is a decline across the board since the 2020-2021 cohort. What’s going on here? Before I answer this question, let’s look at the Honors Physics class.

Honors Physics Achievement: Year-to-Year comparison

As you might expect, their overall performance is much higher than that of the regular physics class. The target levels are Advanced (4) vs. Proficient (3), because the class moves at a faster pace. In addition, there are fewer areas of decreased performance compared to the previous year. This is easier to see in the graph.

Year-to-Year score comparison (Honors Physics)

There are only 3 practices that show a decrease from the previous year, although very few show improvement since the 2020-2021 cohort. The question is: why? Primarily, I need to know whether it is something about the changes that students have gone through or something about the changes that I have made in my class over the last 3 years.

My gut instinct is to lay blame at the feet of the pandemic. Here is why:

  1. It’s all over the news: falling scores and low performance nationwide, post-pandemic.
  2. Anecdotal accounts from my colleagues: we have been having school-wide conversations about the lack of student agency and performance.
  3. The changes that I have made in my course have made it easier to navigate and far more accessible. If anything, I have become MORE flexible, accommodating individual needs regularly.

So, this data is not going to be helpful in analyzing the effectiveness of the LPM, in my opinion. The overwhelming and ongoing effects of the pandemic are still playing out.  If this is true, then the worst is yet to come, as those students who were in middle school in 2020-2021 are just coming up to junior year in 2023-2024.

Case Studies

I had 68 students this year. While looking at class results can provide some interesting directional trends, there are limits to the usefulness of the class data.  How about looking at individual students? There seem to be 5 general categories of students.

  • Those who were engaged fully and unflaggingly all year long can be divided into 2 groups.
    • Student #1: Students who mastered the process but struggled with some of the concepts (ending up at a B+).
    • Student #2: Students who mastered both the process and the concepts (ending up with an A).
  • The students who did just enough to make it through.
    • Student #3: Often the most challenging due to behavior or attitude, these were the ones whose guidance counselors and parents had to be roped in to support them in passing with a C-.
  • In between these two extremes are the “average” students. Some may try to use study skills from middle school that don’t work, despite coaching. They may have poor work or study habits. They may not be very motivated. They may find the content extremely challenging. For some, it’s all of the above. Often they were less consistent, sometimes really trying, doing all the practice, and coming for help, but often dropping the ball completely. Some would acknowledge their general lack of motivation and discipline.
    • Student #4: Those who eventually took action (ending up with a B-).
    • Student #5: Those who didn’t (ending up with a C+).

While every student has his/her own story, I think it is useful to look at a representative student in each of these groups to see if there are lessons to be learned.

Student #1: Brianna

During the first half of the year, Brianna was one of those students who caught on to the process (how we did the work) quickly. Our winter conference was eye-opening, so much so that I wrote about Brianna in some detail. After the midyear, however, her preparation for class took a marked turn, and she stalled in most areas. If you look closely, with 3 exceptions (LP3, LP6, and LP10), she did not improve much, if at all, after Unit 3.

Brianna's Progress 2022-2023

In early March, after students completed their “Unit 4 Progress Report and Goal Setting,” I had them send an email to their parents. I had two primary reasons for doing this. First, I was hoping parents would read an email from their child that they might ignore from me. Second, I was very concerned about what I was seeing in students’ (lack of) performance; I wanted them to overtly acknowledge where they were and take control. During this email writing activity, Brianna became very upset about the interpretation of her scores.  

Because she scored Not Enough Evidence on LP9, she was earning an F. She thought this was “ridiculous,” and let me know it! We met several times to discuss this, and gradually set the stage for how to move forward. She was able to gain some traction over the next two units, bringing her grade closer to target levels. She was able to demonstrate that she understood some of the processes of science even though she clearly had not mastered the approach across the board. This is fairly typical of a B+ student in a regular physics class.

Student #2: Sofia

The students who ended up with A’s shared several attributes: consistent engagement with the assignments, frequent requests for one-on-one help, high-quality communication, and an appreciation for the opportunities that the LPM provided. Most of these students would get A’s no matter what systems you put in place; they are diligent, attentive, and motivated. Sofia is a perfect example: a grade-conscious student who had been in science classes with support for the previous two years of high school, earning A+’s. She had not been challenged, and had liked it that way. Her guidance counselor encouraged her to move up to regular Physics in her junior year, so she came into the class worried that it would be “too hard” for her. In addition, she had significant distractions going on at home with her family. These would have derailed almost anyone else, but aside from an acute period when she was absent, she kept herself on track. Her scores over the year look like this:

Sofia's Progress 2022-2023

If you look at each column, you can see a fairly steady progression. The target levels were Developing (2’s) with some Proficient (3’s) by the end of Unit 3, and Proficient with some Advanced (4’s) by Unit 6. Even during the time period of her most acute troubles (Unit 4), she managed to make progress on most practices. This is a student for whom LPM worked wonders, relieving her of some of the anxiety of periodic setbacks. Because grades were not averaged, she had leeway to fail without any permanent effects. In the end, she was able to communicate what she knew and could do, showing huge improvements since September. The most challenging aspect of working with Sofia was keeping her calm so that she could focus on learning. Once we figured that out, she was on board and provided ample evidence of progress. 

Student #3: Luke

Like Brianna, I wrote about Luke after the mid-year conferences. He remained a challenge all year, seemingly uninterested in either establishing a relationship with me or learning the skills and content of the course. No matter what I did, I could not get him to see me for help, ask questions, work with others, or address any of the practices. As you can see in the table below, he earned Not Enough Evidence frequently throughout the year. (I feel the need to remind my readers that, to earn NEE, you have to omit the most basic requirements of the skill, the one(s) that we have been building on since September.)

Luke's Progress 2022-2023

By the end of the year, with support from his mother and guidance counselor, he produced enough evidence that he had acquired at least a minimal level of the skills and knowledge of the course, with the result of a C+. This was never a sure thing, and I practically held my breath while compiling his scores. In a traditional-grades class, Luke would have been unable to pass at all, probably tanking his average as early as March. This is where the LPM really shows its true colors, offering repeated chances to students who, for whatever reason, feel disconnected.

Student #4: Laura

Laura was always infinitely polite, easy to talk to, and cheerful. She was great at following through and managing her time. However, in most areas, her scores really stalled during the second half of the year. Earning Proficient on 6 out of 10 practices in January, she then seemed to flounder over the next 5 months. Looking at the table below, you can see in Units 4, 5, and 6 how she toggled mostly between Developing (2’s) and Proficient (3’s) with little consistency.

Was it because the content became more difficult, because her focus wavered, or because she had personal distractions? My feeling is that she didn’t fully grasp the physics concepts, and this resulted in a lack of confidence that stopped her from using the skills. The idea with the LPM is that the Beginning and Developing levels are content-agnostic. Starting at the Proficient level, the process has to be correct, and the student should be able to state correct physics content even if it is applied incorrectly. If the student is not able to articulate the physics accurately, then they will remain at the Developing level.

This is different from how I graded before using the LPM: correct work earned a certain number of points and incorrect work lost a certain number of points. If the answer wasn’t correct, students didn’t do well, period. Many students have a difficult time adjusting to the idea that, as long as they communicate the correct process, they can still do well, even with mistakes. Laura did not master how to present clear reasoning, whether because she didn’t understand how or didn’t know the material. When I cannot follow a student’s thinking, they have missed the key to earning Proficient. This happened to the next student as well.

Student #5: Eleanor

Eleanor, on the surface, seemed like a good student: friendly, quiet, and diligent. However, this masked a passive and perfunctory approach to learning. When she asked questions, it was clear that she did not know the foundational material, and this compounded over the year. Looking at her scores, you can see that she rarely earned Proficient, mostly toggling between Beginning and Developing levels.

Interestingly, she wrote me a long email after grades were posted in June. The message was that the grading policy is wrong, she worked so hard, and everyone feels this way. There was more fire in that one email than I saw in her all year! She was somehow shocked by the fact that she earned a C+, despite evidence that she was working at the Beginning levels all year long. 

This type of student was more common this year than in previous years. While I am generalizing, there were about a dozen students who believed they didn’t deserve the C+ that they earned. I spent a lot of time examining their work to make sure that it was in fact an accurate reflection of their achievements, and I believe that it is.

What about the Contracts?

A quick recap: at the beginning of May, I conferenced with every student individually, and we filled out a contract. The contract listed 8-12 items (negotiated between the student and me) that the student needed to complete in order to earn a minimum grade. We mostly used their scores from the unit we had just finished, on Energy, along with the grade translation chart. I say mostly because we also looked back at their general trend; that’s part of the negotiation process. If they could provide evidence of earlier, consistent achievement, then I was happy to use that. I told them, “As long as you do the things listed on your contract, then I promise you that you cannot do worse than this grade in Unit 6.” I am calling that contracted grade “the floor”.

The 8-12 items covered the things that each individual student needed to focus on in order to do well. Of course, if they did these things, they should do better with or without the contract. The point was to reduce stress going into the last unit, and to encourage them to improve without fear of failure. Note that we did not set any ceiling: all students were invited to earn the highest possible grade, an A+, which is defined as exceeding target levels on any 3 practices and meeting the target levels on the remaining 7.
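To make that promise concrete, the floor behaves roughly like the sketch below. This is a hypothetical illustration, not how grades were actually computed; the function name and the grade ordering are my own.

```python
# Sketch of the contract "floor" (hypothetical). A kept contract guarantees
# the negotiated minimum grade; there is no ceiling on what can be earned.

GRADE_ORDER = ("F", "D", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+")

def final_grade(earned: str, floor: str, kept_contract: bool) -> str:
    """Return the better of the earned grade and the contracted floor,
    but only if the student completed the contracted items."""
    if kept_contract:
        return max(earned, floor, key=GRADE_ORDER.index)
    return earned

print(final_grade("C+", "B-", kept_contract=True))   # B- (floor applies)
print(final_grade("C+", "B-", kept_contract=False))  # C+ (floor forfeited)
print(final_grade("A",  "B-", kept_contract=True))   # A  (no ceiling)
```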

Many students rose to the challenge. They worked consistently and kept track of their responsibilities as expected. Those who “contracted” to come for help once each week did so. Those who needed to practice problem solving did so. Of course, some did the bare minimum. Of course. But many, if not most, understood what I was providing for them and engaged in the spirit in which it was offered.

In Physics, 52% of 44 students met or exceeded their floor using the scores from the last assessments. Of the 15 students (34%) who earned scores below their contracted floor, 9 (60%) did not keep their contract. Only the remaining 6 were able to use the contracted lowest grade, because they did what they said they would do.

In contrast, 75% of 24 Honors Physics students met or exceeded their floor using the last assessments and did not need to use their contract. Of the 6 students (25%) who earned scores below their contracted floor, only 1 did not keep their contract. The other 5 were able to use the contracted lowest grade because they did what they said they would do.

This is not surprising. For better or worse, good study habits, attention to detail, and compliance are all reinforced characteristics that separate the Honors level from the regular level. The Honors students are highly motivated, often with better executive functioning skills. In Dave’s Conceptual Physics classes, which have in-class support, it is often the case that students are uninterested and uninvested. Dave has been masterful in negotiating with students like this. But those characteristics seem to be trickling into regular physics now, unlike in previous years.

Overall Grades

Grades. It is unfortunate that we have to reduce an entire year of work down to a symbol. All that students have done, all the beautiful and unique variety of their strengths and weaknesses, distilled to a single letter. I know what this letter means. But does anyone else? I wrestle with it and have come to no firm conclusions. I still waffle. Let me share my thoughts and you can give me your opinion.

The table below shows grades that students earned based only on their end-of-year assessments.

Percent of Students Earning Grades Based on End-of-Year Assessments (Before Modification of Target Levels)

The class averages were both at a high C, with no A’s. This seemed unacceptable to me. In this day, in this place, and in this atmosphere, parents, students, and colleges do not interpret a C as average, nor an A as exceptional. Call it grade inflation, but it is real. A B/B- is average, and only an A+ is exceptional. I needed to examine the results to see why the distribution came out this way.

Modifications

After much ruminating, I modified the target levels for each class. I had good reasons for it.

Let’s tackle Physics first. The target levels were Proficient for seven of the ten practices. Only Arguing a Scientific Claim, Creating Scientific Explanations, and Problem Solving were set at the Advanced level. I decided to reduce two of those three back to Proficient, choosing whichever two helped each student the most.

My rationale was that the last unit was very challenging, we had little time to practice, and my inexperience with the content was to blame for the flaws in the last unit’s design. I thought long and hard about just dropping whichever 2 practices helped them the most, but simply could not justify lowering target levels to Developing. We had been working on Developing levels since the beginning of the year, and I was not okay with zero growth in any particular area. However, I could justify not requiring the Advanced level in a regular class; that seems fair enough. This change helped almost every student jump up at least half a letter grade, if not more.
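In sketch form, that per-student selection looks something like this. It is a hypothetical illustration: I am guessing at which LP numbers correspond to the three Advanced practices (only Problem Solving, LP6, is identified above), and I use “largest gap below target” as a stand-in for the full grade translation.

```python
# Sketch: for each student, drop two of the three Advanced targets back to
# Proficient, choosing whichever pair helps that student the most.
# (Hypothetical: LP5 and LP8 are assumed; only LP6 is confirmed above.)
from itertools import combinations

PROFICIENT = 3

def worst_gap(earned: dict, targets: dict) -> int:
    """Largest shortfall below target across practices; smaller is better."""
    return max(targets[lp] - earned[lp] for lp in targets)

def best_reduction(earned: dict, targets: dict,
                   advanced=("LP5", "LP6", "LP8")) -> dict:
    best = dict(targets)
    for pair in combinations(advanced, 2):
        trial = dict(targets, **{lp: PROFICIENT for lp in pair})
        if worst_gap(earned, trial) < worst_gap(earned, best):
            best = trial
    return best

targets = {f"LP{i}": 3 for i in range(1, 11)}
targets.update(LP5=4, LP6=4, LP8=4)                    # three Advanced targets
earned = dict({f"LP{i}": 3 for i in range(1, 11)}, LP5=4)
print(best_reduction(earned, targets))                 # LP6 and LP8 drop to 3
```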

In the Honors class, there were far fewer problems. As is typical for this level, most of the students worked really hard and talked with me continuously. I’m not going to say that some of the work ethic and integrity issues haven’t trickled into this level, but they are much less common. That said, there were initially 13 of 24 grades below an 80, which is highly unusual. And that’s with the target levels already lower than last year’s, when 3 were set at Expert. This year, the target levels were set to Advanced on all ten practices.

I made three modifications to the Honors targets.  

  1. I lowered the Engineering Design Process (LP10) target to Proficient to accommodate the recent changes I had made. Students uniformly missed some features of the Advanced level, which I attribute to the newly altered learning progression.
  2. The Graph Interpretation question (LP7) that I put on the last test was inadvertently unfair. In every practice leading up to the last test, we had used the mathematical model. However, I gave them a graph with no trendline at all, where they had to use the area under the curve. After scoring, I could see how that would seem to come out of the blue; it really threw them for a loop. I lowered that target to Proficient as well.
  3. Using the same rationale described for regular Physics, I lowered one additional practice from Advanced to Proficient, whichever helped each student the most.

With these well-thought-out adjustments, here are the new grades:

Percent of Students Earning Grades after Modification of Target Levels

When I looked at each individual student’s grade, I now agreed with most of them holistically. Their final grade reflected their level of mastery. For students in the C+ range, I combed through their evidence, which included not only the final test and the final project, but their contract and their tracking log. If I saw any evidence that they were able to perform at a higher level consistently, then I shifted the score to reflect that performance.

The biggest problems occurred for students who did not keep their contract. This was much more of an issue in the regular Physics class. Why? I don’t have an answer, just many hypotheses. Is it because the students seem to think, “I’ll just show up and it’ll all work out”? Is it due to the pandemic closures? Is it due to the influence of social media? Answering these questions will take someone looking at this from a higher vantage point than mine, I think. All I see are my 68 students in my small corner of New Jersey. The trends in my classes, given my content, my experience, and my approach to learning, are my only frame of reference.

The Meaning of Grades in My Classes

For an ungraded approach, I have spent A LOT of time referencing grades in this post. This provides some cognitive dissonance for sure, but it is the environment in which I must work. That said, what do grades mean to me? 

Grades are tied to how well students met the target levels for all 10 of the practices. In general:

  • A = met all targets
  • B = one level below target
  • C = two levels below target
  • D = three levels below target
  • F = not enough evidence on one or more targets
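One plausible reading of that scale in code, assuming the letter is set by a student’s weakest practice (the real chart is more nuanced, and I apply it holistically):

```python
# Sketch of the grade translation (one hypothetical reading of the scale):
# F for Not Enough Evidence on any practice; otherwise grade on the largest
# shortfall below target across the 10 practices.

def letter_grade(earned: dict, targets: dict) -> str:
    """earned/targets map each practice to a level (NEE=0 .. Expert=5)."""
    if any(earned[lp] == 0 for lp in targets):
        return "F"                                   # NEE anywhere is an F
    gap = max(targets[lp] - earned[lp] for lp in targets)
    return {0: "A", 1: "B", 2: "C", 3: "D"}.get(max(gap, 0), "F")

targets = {f"LP{i}": 3 for i in range(1, 11)}        # Proficient targets
print(letter_grade(dict(targets), targets))          # A: met all targets
print(letter_grade(dict(targets, LP6=2), targets))   # B: one level below
print(letter_grade(dict(targets, LP9=0), targets))   # F: NEE on LP9, Brianna's situation
```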

I am fully aware that Dave and I made this up. Grades are always subjective. But this is the scale that we created and communicated. It is arguably better than most, but it is novel. The point is that grades do have meaning for me, as they probably do for you. I can describe each letter grade and what I would expect from a student who earned it. Are there exceptions? Of course! But in order to have integrity, I must be sure that I am consistent with my stated criteria.

The question that arose for me is why some students did not take action to improve in areas in which they were clearly struggling and which would have a significant impact on their grade. I intend to look into this more deeply in another post.

My Action Plan

If I can recognize challenges, I can create possible solutions. Here are some feasible approaches to address some of the issues I have identified above.

  • End-of-Unit Grades. I am not giving up on the ungraded approach. I think that the LPM works. However, given that Physics class represents only 1/8th of one year out of 11, I have been thinking that perhaps we are asking for too steep a learning curve. Just as we start our practices at the Beginning level and gradually work up to Advanced, I should do the same with the reporting. Students and parents need an on-ramp. So I am thinking that I will post grades at the end of each unit, instead of only at the semesters’ endpoints. This is the Beginning level of ungraded reporting. I hate the thought of grades driving the bus, but the reality is that without them, students and parents think that they have no information. And until my school district moves to an SIS that is more conducive to reporting competencies… I feel like I need to adjust my reporting to the tools I am given.
  • Expanded use of the Grading Contracts. Frustrated with the lack of agency from students, Dave and I made a list of approximately 10 actions that we know would increase student performance. For example: “Complete 90+% of labs with improvements based on feedback and/or exemplars” and “Do necessary practice outside of class time (things suggested during our conversations).” The idea is to hand this out to students who state they want an A in the course and tell them that, as long as they do these things, we guarantee they will earn it. I like the idea because it outlines requirements that we have always had, such as the minimum number of labs to complete and how much homework to do. However, I am not 100% sold on its efficacy.
  • Curricular tools. I’ve got several new things on my plate: I am once again teaching AP Physics 1; I’m mentoring a new teacher; my supervisor invited me to take part in “the exploration of the newly released OpenSciEd High School curriculum units (Bio, Chem, & Physics) & piloting 1 unit”. In addition, I have committed to using gotLearning, a platform that streamlines student feedback. I am always one of the first to volunteer and pilot new approaches, always looking to improve the experience in my classroom. I LOVE learning new things, but making the associated assessments and practice is time-consuming and difficult. Adding one thing in usually means that something else pops out. I am not going to lie: sometimes I wish that I could be one of those teachers who does the same thing every year on the same day. But I’m not. Never have been. Still, I wish it nonetheless.
  • Integrating study techniques. I need to spend more class time overtly teaching students a variety of learning strategies and metacognitive skills. This is especially relevant with regard to vocabulary and terminology, and the imperative is even more pressing with the recent and rapid arrival of AI. While I know that it will eliminate other activities, students desperately need to learn how to study.
  • The Final Exam. I was very unhappy with the length and difficulty of the last test. I want the final exam to be a last attempt to improve on all standards, period. So I need to allocate time to give the last unit test and last lab before final exams. How I implement this will be determined later, but I have several ideas.

This summer I will be planning for the next school year, trying to incorporate these changes (and more) into the curriculum and classroom routines. I’m sure that I will write all about it as it unfolds.

The Future of Ungrading

Ungrading improves student performance and learning, in my opinion. My analysis of grades here seems to only highlight the inconsistencies and inadequacies of reducing performance to a single letter. However, I am confident that students do better with this approach than otherwise.

Out of curiosity, I analyzed the data from each of the students in the case studies above. Translating them into “traditional” grades by making Beginning = 1, Developing = 2, etc., I averaged the scores on the last assessments from each unit to give a grade at the end of the unit. Then, using a GPA scale where 4 is an A, I translated the averages into a semester grade and a year grade. And you know what… they earned worse grades than they earned using LPM.

What If We Used Traditional Grading?
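For the curious, here is a minimal sketch of that translation with hypothetical scores. The GPA cutoffs below are my own illustration, not a chart from my school.

```python
# Sketch of the "what if we averaged?" translation (hypothetical data).
# Beginning=1 .. Expert=5; each unit grade is the average of the last
# assessments, and the year average maps onto a GPA-style scale (4 = A).

unit_averages = [1.8, 2.1, 2.4, 2.6, 2.9, 3.1]   # last assessments, Units 1-6

def gpa_letter(avg: float) -> str:
    for cutoff, letter in [(3.7, "A"), (2.7, "B"), (1.7, "C"), (0.7, "D")]:
        if avg >= cutoff:
            return letter
    return "F"

year_average = sum(unit_averages) / len(unit_averages)
print(f"Year average: {year_average:.2f} -> {gpa_letter(year_average)}")  # 2.48 -> C
# Early, low scores get averaged in permanently: a student finishing at
# Proficient (3) still lands at a C. That is the penalty for gradual
# growth that the LPM avoids.
```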

What does this prove? Not much. As stated before, the assignment of a grade is arbitrary; whatever the grading system is, it just needs to be communicated clearly. The LPM does not penalize lower performance levels in the first half of the year, acknowledging that students are expected to learn gradually over time. The problem occurs when students stop improving. Some students seem to wrongly assume that they can simply continue to do what they were doing and earn the same result. It doesn’t work this way; evidence of growth is mandatory. Some students never seemed to grasp that concept.

Conclusion

I initially had a very strong emotional reaction to student results at the end of the year. My multiple early drafts of this blog post were full of intense feelings. As time passed, I was able to look at this more dispassionately. I may write in more detail about those sentiments at another time, because I do not think I am unique in this. For now I will say that the deeply personal, messy, and fraught field in which I work fills me with conflicting feelings, from one end of the spectrum to the other.

As I have said before, I love that teaching has a beginning, a middle, and an end to each year. I have received a lot of feedback from students, both directly and indirectly, both positive and negative. Now I get to use that information to adjust, adapt, and try again. I have learned many lessons this year! I will return for another season, rested, motivated, and as prepared as I can be.

Please feel free to share your experiences and offer your suggestions in the comments section below!