By Mark Heintz


I have two main focuses as I write this weekly blog: two driving questions that guide my decisions. They are:

  • How do I know if my students know? 
  • How do I get them to know if they know?  

Whether it is a skill or content, I want to know if they know it. I no longer think it is acceptable for me to guess or rely on a feeling about whether they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understanding of the content and their progress on their skills. Therefore, the purpose of this year of reflection is to see how I make progress toward these two goals and to elicit feedback from staff, students, and hopefully people who follow along in the journey. You can read how last week went here.

Week Sixteen: Answer the Question

This week the content focus was primarily on the Columbian Exchange and the global flow of silver.  The primary focus was the 1450-1750 time period.  Here were the standards for this unit:

  1. List five results of the Columbian Exchange.  
  2. List three effects of the global flow of silver.

This week’s skill focus was aimed at improving students’ ability on the stimulus-based multiple-choice questions and the document-based question (DBQ).

  1. Analyze primary and secondary sources.
  2. Analyze images.
  3. Write a claim in response to the prompt.
  4. Pull evidence from a document to support a claim. 
  5. Contextualize the prompt.

Cite Specific Evidence

How do I know the students have learned?

The students took two mini quizzes this week. They were more traditional in nature: they were on paper and “graded” by me, the teacher. One was a short quiz on the content objectives above; the students averaged 80% on it. The second was a stimulus-based assessment; the students averaged 70% on it. Here is a sample question from the stimulus-based exam.

How do I know the students learned and how do I know if they know what they were supposed to learn?

This question is still kicking my butt. Each week I try to find out how they know. This week I took a suggestion from a student about using Schoology to have each student evaluate writing samples. To do this, I first had each student write one part of the DBQ essay in a discussion post. After they submitted their samples, I took several of the posted writing samples and created a Schoology quiz. For one of the samples, each answer choice was an item from the rubric, and students selected every answer choice they felt the writing sample earned. In the sample below, the student felt only four of the points were earned.

Explain the Reasoning 

How do I know the students have learned? They consistently perform well on the content tests. The students requested more frequent content assessments to hold them accountable, so I gave them more assessments! Their performance was at the same level as on the last unit exam, when more objectives were being tested. I am not sure what to make of that data point. Since they performed at about the same level, should I keep giving the more frequent assessments or hold out until the end of the unit?

What the more frequent assessments did reveal was that there were a few gaps in their content knowledge.  The students need instruction on the economic systems involved in the Columbian Exchange and the role certain empires played.

There was some good news from using more frequent content assessments: there were fewer outliers. On the last content test, there were a few extreme cases of students “bombing” the test. On this assessment, there were no such cases. Scores clustered much closer to the middle, which suggests that more frequent assessments may help preserve the positive narrative of the course.

How do I know the students learned and how do I know if they know what they were supposed to learn? Using the Schoology quiz to assess student writing was amazing! First, it held every student accountable for submitting a written response and then evaluating specific responses. In the past, I walked around the room while students typically wrote in pairs, and I could not get to every response.

The new use of the Schoology quiz enabled me to have every student record a response. I wanted to know if they knew what excellent writing was. Second, the quiz revealed to me, the teacher, that they do in fact know what excellent contextualization writing is. Almost every student came extremely close to accurately evaluating their peer’s writing sample. I cannot wait to use this method again!

As for the stimulus exam, the students scored at the same level as they did on the previous unit exam. But this time the students spent a day going through the test with explanations of the answers. I had written a detailed explanation of why each answer choice was right or wrong. Having the students spend the day going through the test was fantastic. The dialogue and conversations centered on the questions revealed that the students read most of the questions too quickly. Even though they averaged about the same, they are understanding what they don’t know and why they are getting questions wrong. There is still a lot of work to do in this arena.

Reflection and Impact

As I stated last week, I need to get students’ input more frequently. They have given me such great ideas for improving student learning. I used two of their suggestions this week, and I know they helped. I am going to use the Schoology quiz more frequently to have students evaluate student work. It is quick and simple to do, and it reveals whether students know what is expected of them. Such a powerful instructional tool. I cannot wait to use it again!

Read week seventeen here.
