Caitlin C. Farrell, Julie A. Marsh and Melanie Bertrand
From “data chats” to “Data Fridays,” teachers are involving their students in looking at data. Does your approach increase student motivation—or decrease it?
On one “Data Friday,” Ms. Mendoza tries to encourage healthy competition by displaying her 7th period’s performance on the most recent interim assessment compared with that of her other classes. Down the hall, Mr. Williams passes out individual results on the assessment and asks students to take out their data binders to graph their own progress, reflect on their data, and determine action steps.
Like Ms. Mendoza and Mr. Williams, practitioners and policymakers around the country have expressed considerable enthusiasm for engaging students with data. Some see it as a way to encourage students to exert extra effort; others believe that students who look at their own data gain a better understanding of their strengths, their weaknesses, and how to improve.
How do teachers commonly examine data with their students? And what does research tell us about how these practices are likely to affect student motivation?
Motivating or Demotivating?
Motivation research identifies classroom practices and activities that shape students’ orientation toward goals (Dweck, 2010; Pintrich, 2003).
A performance orientation directs students’ attention to grades and achievement and encourages them to compare themselves with others (Ames, 1992; Pope, 2010). Performance-oriented goals are generally associated with negative student outcomes (Meece, Anderman, & Anderman, 2006). Although some students may be motivated by a performance orientation, others may balk at difficult tasks and give up when faced with difficulty (Pintrich, 2003).
Teachers promote a performance orientation when they make most decisions for students, reward achievement relative to others, use rewards to control behavior, provide boring or repetitive tasks, and divert attention from tasks and learning to achievement (Ames, 1992; Epstein, 1988).
In contrast, a mastery orientation, in which students focus on developing new skills and improving their competence, is associated with self-regulation, increased effort, autonomy, and the belief that effort will lead to academic success (Ames, 1992; Pintrich, 2003; Seifert, 2004). Teachers foster a mastery orientation when they focus on individual improvement, recognize and reward effort, evaluate students privately, involve students in decision making, foster students’ sense of responsibility and independence, provide meaningful and interesting learning activities, and encourage students to set short-term, self-referenced goals (Ames, 1992; Epstein, 1988).
A Study of Teacher Practices
To understand how teachers use data with their students, we conducted in-depth fieldwork in six middle schools in four districts in 2011–12, including interviews, focus groups, and observations with teachers, coaches, and school and district administrators (Marsh, Farrell, & Bertrand, 2014).
The teachers in our study, like most educators in the United States, faced increasing demands from school and district administrators to engage in data-driven practices. Their schools had data walls and regularly scheduled data chats focused on standardized test results. They received new technology to obtain quick results from assessments, generate color-coded displays, and assist with analyses. Teachers also had data coaches, instructional coaches, and professional learning communities to support this work. It was only a matter of time before the demand for data use extended to students.
We met many teachers who believed that having students analyze data would motivate them to learn, and we identified 50 instances where teachers engaged students with their data. Overall, we found that many teachers set up performance-oriented classrooms that may actually have been demotivating for students.
Data-Use Practices Supporting a Performance Orientation
Mrs. Landen, a 7th grade language arts teacher, frequently posted graphs of test results on her data wall following each common grade assessment or district benchmark assessment. These bar graphs compared the results across her classes, which were homogeneously leveled by students’ performance on the prior year’s state test. Mrs. Landen felt that students were motivated by competition:
Yes, it’s to motivate, to encourage my “basic” kids that they’re “this close” to honors, and then it’s to motivate my honors kids. I even lie to my honors kids and tell them, “You know, on this test our neighbor teacher’s honors class scored a certain percentage.”
The graphs highlighted not only the state's performance levels (with an emphasis on the proficiency goal) but also direct comparisons with others. Students were not involved in analyzing their data, and they received very little guidance on how to improve or reengage with material to fill gaps in their knowledge.
Mrs. Landen’s data use typified the performance orientation we saw in one-third of all instances in our study. These teachers
- Believed that if students saw their data, they would work harder and take assessments more seriously.
- Presented data as status-based information. For instance, one teacher posted the names of all students scoring proficient or advanced on district assessments on the classroom wall and had those students sign the list.
- Publicly shared group-level data or even individual results in the belief that social comparison motivated students.
- Used extrinsic rewards, such as prizes and parties when students reached a certain proficiency status, to ensure student investment in assessment results.
- Provided limited opportunity for student involvement, instead showing prepared data displays and telling students how to interpret the information. Teachers provided little guidance about what students should study or revisit.
Data-Use Practices Supporting a Mastery Orientation
Ms. Santos, a veteran 8th grade language arts teacher, began the school year worried about the low performance of one of her classes. She worked closely with the school’s literacy coach to design routines to engage students in analyzing assessment data and setting goals. She provided students with copies of their multiple-choice answer sheets, and together they corrected the results. Students then privately analyzed how well they had mastered each standard, and they chose how they could close their own gaps in learning.
In one instance, Ms. Santos designed state standardized test “clinics” where student choice was central:
I have five different groups by strand, and I have a different activity for them to do. They chose which strand was their weakest and which one they wanted to work on for six weeks. … They bought into that since they were able to see their own strengths and weaknesses.
In small-group settings, Ms. Santos provided students with specific feedback based on each student’s needs. She repeated the cycle at each benchmark assessment, emphasizing students’ individual growth and effort as the key to improvement.
Ms. Santos’s practices, which were likely to promote a mastery orientation, were typical of one-third of the observations in the study. Teachers who used these practices
- Embraced a learning perspective—a belief that examining and reflecting on the data would help students identify their weaknesses, understand what contributed to them, and determine how to address the gaps.
- Focused on growth-related information, articulating a clear relationship between effort and outcomes and encouraging students to consider their progress.
- Shared individual-level data privately with students in ways that focused student attention on how they were performing relative to their own past performance or how close they were to reaching standards.
- Sometimes used intangible rewards like praise and discussion of positive results to emphasize key messages about progress.
- Involved students in analysis, goal setting, and follow-up. For instance, students had opportunities to graph their own results and identify topics on which to focus their reflection.
- Were highly involved in supporting students’ next steps. In whole-group or individual interventions, teachers did not simply repeat the same content and approach but instead tried multiple ways to reteach the material.
Mixed Data-Use Practices
Finally, one-third of the instances of data-use practices we observed were mixed—that is, a hybrid of mastery and performance practices. For instance, in one school, classrooms were equipped with a “scan-cam,” a tool students used to immediately learn their individual scores on multiple-choice questions. A 7th grade teacher, Mr. Wilson, implemented this tool in such a way that students were actively involved in analyzing and identifying gaps. However, identifying a response to the results was teacher-driven, and it rested largely on test-taking skills rather than on helping students reengage with the content.
What Shapes Teachers' Practices?
The environments in which teachers worked—characterized by school-level policies, routines, type of leadership, district-level expectations, and the broader accountability context—greatly shaped how they used data with students. In some cases, these factors pressed teachers to focus on performance; in others, they created opportunities for teachers to focus on mastery.
First, school-level policies and routines like data chats and data walls appeared to define acceptable student data-use practice for teachers. In one school with a high proportion of teachers who had a performance-oriented approach, a schoolwide policy required teachers to post comparative class data with a focus on proficiency goals to encourage competition between teachers and classes. Classroom structures then tended to replicate the components of the data-wall policy, such as attention to status and class comparisons.
Messages from school leaders—administrators, coaches, and teacher leaders—also framed the discourse around data use. In one school with the highest proportion of mastery-oriented observations, the assistant principal communicated mastery-focused messages to teachers:
I’ve tried to be really consistent in my message about how important goal setting is for kids, to teach them how to set goals, how to monitor their goals, how to assess their goal achievement and then recalibrate their goals based on how they’ve accessed them.
Several teachers at this school mentioned how the assistant principal’s vision helped them think about their own practices.
District policies and norms also shaped data-use practices. The literacy curriculum one district adopted fostered mastery-oriented practices with learning-focused, formative assessments, such as readers’ notebooks and conference logs. In contrast, the culture and norms in another district promoted competition among schools. During monthly meetings with principals, district administrators regularly reported and compared schools’ scores on the district benchmark assessments.
Finally, the broader national and state accountability environment in the United States promotes a performance-based orientation. Federal accountability policy (embodied in No Child Left Behind) has long emphasized status measures of student achievement and assumed that public reporting of information on performance, coupled with consequences, will motivate individuals to work harder to improve performance. It’s not surprising that many educators replicate the orientations promoted by the broader structures within which they operate.
Questions to Guide Practice
Educators are unlikely to have control over the broader state and federal accountability messages that shape data-use practices. But with an eye for what’s realistic and possible, we offer some questions that educators who want to engage students with data should ask themselves:
For teachers: What is your purpose in engaging your students with data, and how do you structure your classroom practices to meet these goals? What elements of your data-use practice are inadvertently emphasizing performance? How could you reorganize or refine your data-use practices to reflect a mastery orientation?
For school and district administrators: How are school or district policies and routines framing messages around data for your teachers that might translate to a mastery or performance orientation within classrooms? Do school or district policies and programs emphasize the values of performance, status, and extrinsic rewards; or do these policies recognize effort and growth?
As we work toward the laudable goal of involving students in data use, we want to make sure that our data-use practices support and motivate students, rather than deflate or demotivate them.
Authors’ note: Teachers’ names are pseudonyms.
Ames, C. (1992). Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology, 84(3), 261–271.
Dweck, C. S. (2010). Even geniuses work hard. Educational Leadership, 68(1), 16–20.
Epstein, J. L. (1988). Effective schools or effective students: Dealing with diversity. In R. Haskins & D. MacRae (Eds.), Policies for America's public schools: Teachers, equity, and indicators (pp. 89–126). Norwood, NJ: Ablex.
Marsh, J. A., Farrell, C. C., & Bertrand, M. (2014). Trickle-down accountability: How middle school teachers engage students in data use. Educational Policy, 1–38.
Meece, J. L., Anderman, E. M., & Anderman, L. H. (2006). Classroom goal structure, student motivation, and academic achievement. Annual Review of Psychology, 57, 487–503.
Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in learning and teaching contexts. Journal of Educational Psychology, 95(4), 667–686.
Pope, D. (2010). Beyond “doing school”: From “stressed-out” to “engaged in learning.” Education Canada, 50(1), 4–8.
Seifert, T. (2004). Understanding student motivation. Educational Research, 46(2), 137–149.
Caitlin C. Farrell is the director of the National Center for Research in Policy and Practice at the University of Colorado, Boulder. Julie A. Marsh is an associate professor at the Rossier School of Education, University of Southern California. Melanie Bertrand is an assistant professor in Mary Lou Fulton Teachers College, Arizona State University.
Copyright © 2015 by ASCD
November 2015 | Volume 73 | Number 3
Doing Data Right Pages 16-21