
Data Analysis

Interview

I chose to interview my students to see which strategies they were using to solve basic addition and subtraction facts. Students were asked to explain their strategy for each of the following five problems: 2+6, 9-3, 4+1+3, 20+16, and 19-12. There is no wrong way to find an answer; however, I wanted my students to shift their strategies toward solving mentally or using place value. I wanted to use this qualitative data to understand each student's thinking process and to teach him or her the best strategy for each of the five types of problems. Initially, my students were using their fingers to solve almost every problem.

As my pre-test graph shows, students were using workable strategies; however, there were more effective ones to be taught and used. 91% of my students were counting on their fingers to solve 2+6, and only 10% recognized that they could use place value to solve 20+16.


Again, many students had useful strategies, such as counting on their fingers, counting on/back, and mental math. It was simply a matter of teaching them which strategy would be most effective for a given problem. By the end of the seven weeks of implementing my action plan, only a few students were still using their fingers to solve 2+6. Only 22% of my students still needed that support, while the rest had begun the shift to mental math: students recognized that they could start at 6 and mentally count on 2 to get the correct answer of 8.

Once students had been exposed to place value, they were able to use it correctly to find the answer to 20+16. At the beginning, only 10% said they would use place value to solve the two-digit addition problem; by the end, 78% of my students were using place value effectively to solve it.
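To make that shift concrete, here is a sketch of the place value reasoning students described (a representative example, not a transcript of any one student's work): the addends are broken into tens and ones, the tens are combined first, and then the ones are added.

20 + 16 = (20 + 10) + 6 = 30 + 6 = 36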

What this data shows me is that my students' thought processes shifted dramatically toward more effective strategies based on the given problem. My students were able to look at an addition or subtraction problem and choose the strategy that would get them to the correct answer as efficiently as possible.

Topic Tests

As part of my action research, I wanted data on my students' math topic unit tests from before and after the seven weeks. This would give me insight into whether my differentiated math groups had any effect on learning. I averaged my students' end-of-unit test scores to produce these figures.

What this bar graph shows is that my students had a relatively low average of 83.3% on the four tests taken before I implemented my action research. I was teaching entirely whole-group instruction during this time, and I was not seeing my students apply the skills I had taught on their end-of-unit tests.

During my research, I was intrigued by guided math and setting up stations. I found how powerful it is to differentiate math instruction and connect it to prior knowledge, and that is what I did when I met with my guided math groups. Instruction was different for each group and tailored to its level.

At the end of the seven weeks, my students' average score had increased by 8.2 percentage points (from 83.3% to a 91.5% class average) across the four tests taken after implementation. I attribute this increase in averages, and in understanding of the content, to my direct, differentiated instruction.

Fact Checks

Each week, I assessed my students' knowledge of the fact they were studying for that week. Students used flash cards to study their fact, as well as the cover-copy-compare strategy (look at a fact with its answer, cover it, write it from memory, then compare). I tracked my students' movement through their facts over the course of the seven weeks. They needed a passing score of 17/20 (85%) to move on to the next fact or level; if they did not pass, they remained on the same fact the following week.

What I noticed is that 60% of my students passed every level during those seven weeks. Four students narrowly missed one level.

Students 2, 11, 12, 14, and 22 all receive math interventions for 20 minutes every day. Within those 20 minutes, instruction on math facts is provided.

Additionally, students 5, 16, and 18 receive special education services. 

The strategies used throughout the seven weeks helped students to grow in their math fact fluency and number sense. 

This graph shows the weekly percentages of students who met or did not meet their fact check goal of 17/20. I noticed that 82%-86% of my students were meeting their goal each week. Those who were not meeting their goal were typically just a few points short.

My class averages decreased as the weeks went on; however, I attribute this to the increasing difficulty of the facts the students were asked to complete.

Overall, a majority of my students experienced success with moving up levels each week.

AIMSWeb

My students scored significantly higher on the spring AIMSWeb assessment. This assessment measured the students' ability to answer single-digit addition and subtraction facts in one minute. Because my students had practiced basic addition and subtraction facts through the Math Workshop model, the computer program XtraMath, and their flash cards, their fact fluency was stronger than it was before my action research.

Before my action research, only two students met the district benchmark score of 12. In the spring, fourteen of my twenty-two students met the benchmark of 16. Six of the eight students who did not meet the spring benchmark either received special education services for a math disability or saw the math specialist every day. Overall, I saw my students' confidence in fact fluency increase, and they became quicker at answering basic addition and subtraction facts.

Questionnaire

I wanted a better understanding of my students' attitudes toward math before I implemented stations, so I had my students complete a questionnaire about the way they felt about math. While I was teaching whole-group instruction, 77% of my students said they enjoyed math; on the post-questionnaire, after the implementation of stations, 95% of my students said they enjoyed math time. I still had a few students who did not think math was fun. I found that no matter what activity was planned, a few students simply do not enjoy math.

I noticed a 10% decrease in students who wanted to do math by themselves. My students really enjoyed completing the hands-on activities with a partner. They were engaged while completing math activities with a peer; as a result, working independently was not as fun for some students.

I wanted to reduce my students' nervousness about timed tests, so I did not put a time limit on the fact checks; students were able to go at their own pace. Research suggests that math fact practice should not be timed because of the stress a time limit can place on students. My students responded well to not having one.
