What Do Teacher Teams Do When They “Look at Data”?

From the Marshall Memo #430

In this thoughtful American Journal of Education article, Judith Warren Little of the University of California, Berkeley says there is very little good research on what teachers actually do when they engage in data-based decision making. She suggests that researchers zoom in on the details of teachers’ work, using a “micro-process” lens to get a better sense of what works and what doesn’t when teachers look at interim assessment data. 

Micro-process research has been used to see what goes on when a doctor conducts a standard medical interview of a patient. Close observation of the human dynamics has revealed that the doctor’s questions often focused on biomedical data, ignoring the patient’s efforts to introduce “lifeworld” information. “More specifically,” says Little, “an interview structure that privileged short answers to a physician’s questions about symptoms tended to silence, constrain, or interrupt longer patient narratives that might have served as a resource in medical diagnosis and treatment.”

In schools, micro-process studies have helped spotlight the dreary “I-R-E” (Initiate-Respond-Evaluate) classroom dynamic – the teacher asks a question, a student responds, the teacher says whether the answer is right or wrong, and the cycle repeats. 

Little summarizes six studies that closely observed teachers as they worked with data. In the first (Earl, 2008), the researcher noticed (in Little’s words) “the tendency of teachers to turn away from the data in hand even when it is closely linked to the curriculum in use and to talk in more general terms about instruction or noninstructional factors in student performance, such as parental expectations.” This study pointed to the challenging and vital role of the principal “to sustain a focus on the data and on interpretations and implications that could be anchored specifically in those data.” Earl described a principal who repeatedly brought teachers back to making meaning of the evidence in front of them by asking: “What patterns do you think are meaningful? Are there any other patterns that you find? How do you feel about how the grade 1 students are doing? I want to go back to the data. You know what is really interesting to me. Look at this plum color. Let’s look at the data wall for grade 1. Would you take us through each child and tell us about them.”

The second study (Timperley, 2008) compared schools doing data analysis, some producing good learning results and some producing no gains. In the effective schools, leaders set a clear purpose for the data work, teachers met more often, student reading and writing data were on the table, and the focus was on how specific teaching practices enhanced or inhibited student gains. In the unsuccessful schools, leaders gave only vague direction, and data conversations lacked substance and clear instructional implications. Timperley says, “Rather than basing these conversations on information about student progress, they focused mostly on teaching practice… Less effective conversations became stuck in activity traps in which examining data and having conversations was seen as a good thing to do with only a vaguely defined purpose for doing so.” Timperley also noticed a difference in professional norms between the successful and unsuccessful schools. In the former, teachers reached out to instructional coaches for help and accepted suggestions to be observed and to observe each other’s classes. In the latter, teachers didn’t critically analyze different ideas, accepting a variety of suggestions as equally valid. 

The third paper (Little and Curry, 2008) analyzed transcripts of 40-minute “critical friends groups” in which teachers used a protocol to present and discuss evidence of student learning and effective teaching practices: describing the student work while refraining from judgment; interpreting the work; and considering implications for classroom practice. One of the key things Little and Curry noticed was how important it was that teachers understand what they were teaching – in this case, the genre of the persuasive essay. 

The fourth paper (Lasky, 2008) described teachers working with the “Data Wise” process and found that conversations tended to focus on procedures and process rather than on the meaning of the student results. 

The fifth paper (Barrett, 2009) described four “small learning communities” working with the Teacher Leadership Model to create a coherent curriculum and a data-driven system of accountability. Barrett conducted more than 50 observations of teacher teams over an 18-month period and was struck by the fact that teachers were more likely to speak up when they were engaged in “kid talk” – talk that was frequently superficial, laden with stereotypes, and focused on explanations for student failure outside teachers’ control. In Little’s words, “The presence of a facilitator and the availability of tools for displaying and reviewing data appeared to offer limited purchase on the tenor and direction of the discourse and especially on what appear to be deeply ingrained ways of classifying students according to perceived effort, motivation, and ability.” 

The final paper (Kazemi and Franke, 2004) followed eleven elementary teachers over a one-year period as they examined their students’ responses to agreed-upon mathematical tasks and activities. “Teachers’ inferences from these written records of student work – inferences about what students ‘must have been thinking’ – were challenged when the teachers started more systematically to elicit students’ verbal explanations of what they had done and how they were thinking and to report those classroom conversations alongside the work that students produced,” says Little. “Teachers’ understanding of mathematics teaching and learning deepened, and their classroom practices shifted, when they attended to the details of student thinking and problem-solving practice as those were revealed in a combination of student work samples and narrative accounts of classroom interaction.” 

What made this last study so rich and helpful was the use of audiotape to capture the details of teacher meetings, including:

  • Uncovering the gradual shift in teachers’ orientation to the specifics of student thinking;
  • Linking that shift to the change in the nature of classroom evidence considered by teachers;
  • Tracing the contributions of individual teachers to the group’s deliberations;
  • Identifying how the facilitator’s specific moves and interventions furthered teachers’ development.

Open-ended analysis of a small sample of student work on common instructional tasks proved more helpful than looking at all-class data; the latter tended to focus “on the correctness or incorrectness of student responses with little attention to evidence of the reasoning behind the response,” says Little. 

“Understanding Data Use Practice Among Teachers: The Contribution of Micro-Process Studies” by Judith Warren Little in American Journal of Education, February 2012 (Vol. 118, #2, pp. 143-166), http://bit.ly/Hb2ktu 

