Promising practices for assessing and adjusting your instruction to meet students' needs.
Data is often thought of as comprehensive spreadsheets of numbers, graphs, and charts representing scores from end-of-unit tests or standardized exams. It’s often analyzed to determine whether or not students have mastered content and skills, rather than to inform instruction or translate into timely teacher moves in the classroom.
Quantitative data has its place; however, it alone does not suffice. In addition to charts and graphs, teachers need qualitative data to inform and adjust their instruction along the way — throughout the unit, and within particular lessons. So, what are some of the ways teachers can gather this kind of data and make use of it? What can it look like?
A portrait of practice
In a recent visit to a school in Georgia, a colleague and I had the opportunity to conduct walkthroughs of select classrooms. One teacher I witnessed — a seasoned math teacher facilitating a lesson on solving equations with decimals — was doing a fantastic job of taking the pulse of her classroom and assessing the needs of her students throughout her lesson. I want to share what I observed, as I think it can be a useful case study to help us answer the above questions.

She posed a question for the do now, and after circulating to assess how her students were doing, she addressed the class: “Okay y’all, I want us to stop for a minute. I’m noticing that what is tripping us up with this problem is rounding, and I would hate for this small detail to result in us getting these types of problems wrong!” From there, she asked students to look back at their problem, particularly to see if they rounded correctly, while she prepared the next step of the lesson on her computer. After a few more minutes, she asked the students to go back to their seats and informed them that they were going to engage in a Kahoot to provide more practice with rounding. (Kahoot is a wonderful tool not only for offering practice, but also for gathering data quickly and accessibly. After each question, Kahoot displays a chart indicating how many students selected each answer and whether or not it was the right one.)

This was a simple and effective way to gather and use data in the moment in order to shift the plan for the day’s instruction. Rather than push forward, she took stock of what was needed and responded intentionally. And it didn’t stop there. As the lesson progressed, she continued to gather data while students were working, and made shifts based on what she observed. I watched her create a few different groups based on the information she had: one for students who needed more rounding practice; another that focused on the original practice problems for the day; and another that was pushed with more challenging questions based on their strengths.

This case study offers some promising practices for gathering and analyzing data, and making in-the-moment adjustments to instruction. In addition to the practices I described, I want to offer a few more that I utilized while leading my own classroom.
Turn and talks
Turn and talks are an effective means of assessment that I leaned on heavily during my time as a classroom teacher. Given my large class size, turn and talks allowed me to check for understanding with more students than I could if they were working independently. I often used turn and talks as part of a do now, where I would pose a question and then have students talk to a shoulder partner while I circulated and listened in on their conversations. I also liked to use turn and talks as part of a guided practice, where I would model a strategy and then have students try it out with a partner while I listened and observed. I sometimes used a checklist to note which students seemed to be getting it and which might need more support; this informed how I grouped my students for the lesson and who I might need to conference with individually.
Conferencing
Conferencing is another powerful formative assessment practice that can be very instructive for both teachers and students. Conferences, when executed effectively, involve looking at student work and asking clarifying and/or probing questions to determine what a student needs, in the moment, as they practice a new skill. Based on this investigation, the teacher identifies a high-leverage strategy that can advance student learning, often models it, and then observes while the student gives it a try.
Collecting and sorting student work
Lastly, collecting and sorting student work is an effective means of assessment that can be particularly informative for sequencing instruction. As an elementary teacher, I made it a point to collect student work once a week, whether it was students’ writing notebooks, their reading post-its, or their drafts of writing. I would look closely at the work to determine strengths and struggles, and then identify any common trends that could inform my grouping as well as the goals I should set for these groups. For example, if we were working on a writing unit focused on non-fiction essays, I might review student work and notice common challenges related to supporting thinking with evidence, using proper citations, or analyzing evidence to make connections to claims. I would sort the challenges, attempt to narrow them down to the three or four that would form my groups, and then identify a teaching point for each to implement the following week. It often felt like a lot of work, but when I did it, I always found it enlightening, and I appreciated how it pushed me to ensure I was tailoring my instruction to what my students truly needed. (A twist on this for middle and high school teachers could be to collect and sort exit tickets, as they are likely more manageable than collecting drafts.)
As teachers, we need to understand and address our students' needs as they arise, as they engage in the learning process and acquire new skills. In doing so, we can reflect on and improve our instruction before it’s too late. What I hope I have provided are meaningful and manageable ways to gather qualitative data and make use of it in the moment and beyond.
How to recognize patterns in student performance as you take your next steps toward strategic instruction.
Analyzing data from high-stakes exams is:
Answer: D — All of the above

While promising practices for using data to inform instruction are well intentioned, the process and impact often miss the mark. Data analysis can become an overwhelming and confusing experience that pulls educators into sinkholes, producing unreliable conclusions and eating up valuable time and resources. How can we reap the benefits of data analysis and avoid these drawbacks?

Analyzing data is valuable because it helps us zoom out from individual results to recognize patterns and trends in performance, so that we can make future choices that will benefit our students. But depending on the type of data we’re analyzing, and our purpose, how we approach the analysis and the conclusions we draw can change dramatically.
Where did the data come from?
Let’s start with the basics. When analyzing data, we want to be clear about where the data came from and how it was produced. We can draw different conclusions and take different action steps if we’re analyzing a task we designed versus results from a national diagnostic. For example, when analyzing an in-class assessment, if teachers realize that most of their students missed question #3, they can look at question #3, realize that it’s confusing, and rewrite it for a future exam or eliminate it from the students’ grades. However, if they’re analyzing question #3 from a state test, they have no control over the question or its wording, and they can’t eliminate it from their students’ grades. Knowing where the data comes from, who designed the task, how the task was scored, and the stakes connected to the data will help us determine our purpose for the analysis and the usefulness of the data.
When was the data collected?
Another key factor we want to be aware of is the time between when the data was collected and when it’s being analyzed. If the data has been collected and analyzed in real time (within a few days or weeks of the assessment), the results of the analysis may be immediately applied. This is most commonly seen after analyzing in-class formative assessments, exit tickets, or in-class tests or quizzes. Teachers can use the findings of their analysis to identify the needs that emerged and course correct for their students in real time. It’s not uncommon, though, for data analysis to take place well after the assessment was completed. This is especially true for state tests, national diagnostics, or other formal assessments. When several months or more have passed, the data becomes more like an artifact from the past than real-time information about what specific students know and can do. Artifacts can be extremely insightful and help us to see patterns and trends that might have been obscured at the time the assessment was taken. When looking at data collected in the past, we can use it as a snapshot of a specific point in time and consider what is the same and what has changed since the data was collected.
Whose data is it?
Next, we want to consider whose data we’re analyzing. Are we looking at current students in our class, whom we’ll see in person within the next week? Are we looking at former students who’ve left our class and moved on to their next learning experience? Are we looking at a larger picture of students we’ve never taught and aren’t likely to encounter personally? When thinking about the “who” of data, we want to consider the students whose performance generated the data, who we’re teaching now, and how understanding the data will help us refine our practice for our current students, even if we never taught the students whose data we’re analyzing, and never will. A helpful paradigm for this might be: data from... and teaching to...
Analyzing multiple choice data
Once we are grounded in the basics — when we understand where the data comes from, when it was collected, and who our instruction is targeted towards — we’ll have some direction and purpose for looking at multiple choice results. To make sense of the data, and to use the information strategically, we can consider our next steps based on the following scenarios:
75% or more students answered a question correctly
DATA FROM...
Whether it’s a spreadsheet of numbers or infographics that reflect the data in charts or other visual models, one of the first trends to examine emerges with questions that most students (75% or more) answered correctly. These questions help us to identify the key content or skills that are present in the curriculum and were taught so effectively that most of the students in the cohort were able to answer correctly during the exam.

CRITICAL QUESTIONS

When we’re analyzing the data to inform our future curriculum mapping and instruction, we want to reflect on where and how these concepts show up in our curriculum and put a star next to them. We may examine the instructional methods that were used here and see if we can expand these practices to other topics in the course. As we review these correct answers, we can ask ourselves:
TEACHING TO...

If we’re analyzing data from current students in preparation for them to take the same or a similar exam again, we’ll also want to take a close look at the students who got these questions wrong. Narrowing in on the fewer than 25% of students who answered incorrectly, when everyone else in the class answered correctly, helps us to identify students in need of an immediate intervention. These questions reveal that while everyone else was able to learn the content taught in class and apply it on the test, this group of students continued to struggle. Reviewing these concepts won’t be a good use of class time for all students, but with the data we can identify the specific students who will benefit from some increased support and reflection on their learning.
75% or more students answered a question incorrectly
DATA FROM...
After reviewing what most students answered correctly, we can turn our attention to what most students answered incorrectly. When 75% or more of our students got a question wrong, it points to a potential gap in our curriculum or instructional methods.

CRITICAL QUESTIONS

As we review the incorrect answers, we can ask the same questions as before:
Our answers to these questions will reveal topics that perhaps we didn’t cover but needed to, or places where our instruction was rushed and students didn’t have a memorable experience to take with them into the exam. When we analyze the data to inform our future curriculum mapping and instruction, these questions help us better understand where we need to revise the learning sequence, pacing, or focus of our future instruction. They may reveal instructional strategies that were less effective, or a change in assessment expectations that can be translated into curriculum planning.

TEACHING TO...

When analyzing the data to inform current instruction for students who can retake the exam, these questions reveal the topics or skills that the whole class would benefit from reviewing or re-learning. More specifically, when we examine the specific answers students gave (did everyone choose the same wrong answer? Did they choose different wrong answers? What do their responses tell us about their misconceptions?), we can identify misconceptions and use that information to focus our instruction moving forward.
50/50 split between correct/incorrect answers
DATA FROM...
The third step in analyzing multiple choice data is to examine the questions that split our class into two groups. When around half of the class got a question correct and the other half got it incorrect, the question highlights content and skills that often mark the difference between students who are just barely passing and students who are just barely failing. Since at least half of the students answered the question correctly, we can have some assurance that this content was taught, but that not all students were able to internalize the concepts or recall them on the day of the test.

CRITICAL QUESTIONS

When we encounter these questions we can ask:
TEACHING TO...

When analyzing the data for current students who have an opportunity to retake the assessment, it is useful for students to reflect on their responses and have another opportunity to resolve misconceptions. When analyzing the data for future students, these questions flag content that needs more time, differentiation, or strategic instruction. These questions are key for seeing the tipping point between students who are meeting exam expectations and students who are close to doing so but can’t quite make it yet.
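Taken together, the three scenarios above amount to a simple sorting rule over per-question results. For readers who like to see the logic spelled out, here is a minimal sketch in Python, assuming a hypothetical results table; the function name, data format, and adjustable 75% threshold are all illustrative, not part of any particular gradebook or testing platform:

```python
# A minimal sketch (not from the article) of the three-scenario sort described above.
# Assumes a hypothetical results table: {question_id: {student_name: answered_correctly}}.

def analyze_multiple_choice(results, high=0.75):
    """Sort questions into the three scenarios and flag students for intervention."""
    summary = {"mostly_correct": [], "mostly_incorrect": [], "split": []}
    interventions = {}  # question_id -> students who missed a mostly-correct question

    for question, answers in results.items():
        correct_rate = sum(answers.values()) / len(answers)
        if correct_rate >= high:
            # 75% or more answered correctly: content was taught effectively;
            # the few who missed it may need immediate, targeted support.
            summary["mostly_correct"].append(question)
            interventions[question] = [name for name, ok in answers.items() if not ok]
        elif correct_rate <= 1 - high:
            # 75% or more answered incorrectly: potential curriculum or instruction gap.
            summary["mostly_incorrect"].append(question)
        else:
            # Roughly split: tipping-point content needing more time or differentiation.
            summary["split"].append(question)
    return summary, interventions


# Example with made-up data:
results = {
    "Q1": {"Ana": True, "Ben": True, "Cal": True, "Dee": False},
    "Q2": {"Ana": False, "Ben": False, "Cal": False, "Dee": True},
    "Q3": {"Ana": True, "Ben": True, "Cal": False, "Dee": False},
}
summary, interventions = analyze_multiple_choice(results)
print(summary)        # {'mostly_correct': ['Q1'], 'mostly_incorrect': ['Q2'], 'split': ['Q3']}
print(interventions)  # {'Q1': ['Dee']}
```

Under these assumptions, the output yields a whole-class review list (mostly incorrect), a tipping-point list (the 50/50 split), and a per-question roster of students who may need immediate intervention on otherwise well-mastered content.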
When we take time to analyze student performance on an in-class assessment, state exam, or national diagnostic, we’re really taking the time to invest in our own learning. The more we can identify, recognize, and even predict the patterns and trends in student performance, the more we have to work with when we’re in the planning process.
Beyond simply helping us develop more effective curriculum maps and instructional methodology, data offers us the opportunity to use this information with current students who will be retaking the exam in the future, building a blueprint of concepts and skills they need to develop in order to meet their target goals. Examining the data from all three vantage points gives us the perspective we need to make strategic choices in the future.
A four-step process for jumpstarting your analysis.
Picture this: you just learned that you have access to your students’ test scores, or the standardized tests they took last year, and you are tasked with “analyzing the data.” You know your goal is to use those scores to inform your instruction, but what do you do? And where do you start?
Don’t panic.
Starting your analysis
As a doctoral student, I always appreciated understanding analysis in this way: “all analysis is basically sorting and lifting” (Ely, Vinz, Downing, and Anzul, p. 162). This image of sorting and lifting helped me to take action, and not feel stuck or paralyzed by mounds of data that just sit there and do not analyze themselves. Enter The 'Tions, a reflection tool that offers a pathway for making meaning out of a data set. With this resource, users can begin to navigate complex information they are charged with analyzing by exploring four categories: Confirmation, Inspiration, Revelation, and Application.
Using The 'Tions
While there is no single, prescribed way to use this tool, one recommendation is to begin in the top left quadrant — Confirmation — as a way to acknowledge your perceptions prior to diving into the data. This not only allows existing assumptions or hypotheses to surface before entering the data, but also nudges you to begin thinking about the work and to tap into prior knowledge. The Inspiration quadrant is a natural next step, as it encourages a fresh way of looking at the data. If your data set is not particularly “inspiring,” consider identifying areas in the data that highlight strengths. As you make your way to the Revelation quadrant, don’t be surprised if it reveals information similar to what you entered in the Confirmation section. This section will be most meaningful if you can remain open to taking a new perspective on the data at hand. Finally, use the Application quadrant to consider your next steps after looking at the data. Consider how you will use your Inspirations and Revelations to inform practical next steps in your work. What’s a small step you can take that would make the biggest difference?
Using The ‘Tions as a tool for reviewing data is a start, and can support the user as they begin to lift and sort the information before them. While The ‘Tions will not solve all of your data analysis issues, it will help you get unstuck and begin the important work of looking at data through specific lenses.
Set clear instructional expectations that help elicit students' highest-quality thinking.
There are many excellent reasons for looking closely at student work. We can look at the work to engage in inquiry around particular students and understand their strengths and struggles across classes and content areas, or we can look at the work of a particular class section in order to create strategic student grouping and inform differentiation for upcoming lessons. Sometimes, looking at student work can be a really meaningful way to reflect on our own instructional and curricular design — especially when we’ve recently created a new learning activity or assessment that we’ve never used before.
Recently, I supported teachers at one of our wonderful partner schools in student work analysis for this purpose. Like many schools across the city, this school has made rigor a focus for professional learning this year. For this learning cycle, teachers worked on designing rigorous tasks using the Rigormeter, a resource designed by Dr. Roberta Lenger Kang that re-envisions Bloom's Taxonomy. For this particular session, teachers were invited to bring two different pieces of student work: one that “met or exceeded expectations” for the task, and one that “is not yet meeting expectations.” Here is a snapshot of what they did with those pieces of student work, and how this reflection will inform next steps.
What does success look like?
For the first round of reflecting and sharing, teachers spent some time with the piece of work that met or exceeded expectations. Teachers were invited to engage in individual reflection using an iteration of our What / So What / Now What resource, which offers a process for making low-inference observations, analyzing findings, and identifying a course of action.
In this round of reflection, teachers had the opportunity to consider and articulate the success criteria for their rigorous task, and to identify the skills and knowledge required for students to produce work that meets or exceeds expectations. Ideally, we’d like to have a clear vision of the success criteria before we ask students to engage in a task, but when we are trying something for the first time, this type of reflection might be necessary to gain further clarity. Let’s say a Social Studies and ENL co-teaching team is trying out a new assignment in which students plan an educational tour of Greece for the class, choosing at least three stops on the map and explaining why each place or geographical feature is important for understanding the ancient civilization’s culture. A What / So What / Now What-style reflection on a successful student’s work might look like:
What are revisions and next steps?
For the second round of reflecting and sharing, teachers moved their attention to the piece of student work that was not yet meeting expectations. Once again, they engaged with a What/So What/Now What protocol, this time with slightly different prompts:
A sample reflection for that same scenario might be:
Insights and next steps
During our whole-staff debrief, one common high-level insight that emerged across departments was that we cannot make assumptions about students’ knowledge and skills before assigning rigorous tasks. Thus, a next step in the group’s professional learning will be to explore the concept of formative assessment and its connection to engaging students in work that is appropriately challenging. The whirlwind pace of teachers’ work does not always allow a beat to critically reflect on planning; often, after trying something new, we must take stock as quickly as possible and then shift our attention to the next lesson, activity, or assessment. However, when we can carve out space for intentional reflection (and school leaders and PD coaches support us in doing so), there are undeniable benefits — perhaps the most obvious being that we gain much better insight into students’ immediate needs. But if we also understand student work to be a mirror of our own practice — reflecting back ways to improve our own instructional design — then looking at what students produce in real time informs how we will design our next learning activity, or how we revise our design for next time so that it elicits students’ highest-quality thinking.
A flexible path toward mastery that provides structured support for students at all levels.
When I was growing up, my high school Social Studies teacher had a poster hanging on the wall that read, “If you think you can, or you think you can’t, you’re right.” The message was clear, even to teenagers: the power to succeed or to reach a new goal is often inside each of us. As educators, we know that our students’ mindsets play a major role in how hard they try, how much confidence they develop, and how committed they are to reaching their goals. But confidence alone doesn’t get them to a point of mastery. And desire alone won’t develop their skills, or increase their knowledge base, or level up their accuracy or precision. For those changes, our students need structured support!
This structured support often comes in the form of scaffolding. Like the large platforms that help construction workers reach the tall exterior of a building, scaffolding student learning creates platforms of support as teachers incorporate challenging texts, complex tasks, and abstract ideas into their instruction. Scaffolding is critical when holding high expectations and implementing a rigorous curriculum — but scaffolding alone doesn’t develop independent learners. Sometimes, scaffolding can become a crutch that teachers and students use, turning a support into a shackle. As educators, we often spend a lot of our planning time thinking about how to build scaffolds to break learning down into manageable components, but we can’t stop there. We must also consider the ways we gradually release scaffolding so that students can internalize and transfer their knowledge and skills to new tasks and topics.
A path toward mastery
Our Progressive Scaffolding Framework outlines a path for educators to consider when setting high expectations for students, helping them find the balance between necessary supports and structured enabling. Building on the ideas of the Zone of Proximal Development and apprenticeship theories, the framework outlines a path toward mastery in four stages:
Stage 1: I do, you watch
When introducing new content or skills, we begin with the I do, you watch stage. We initiate this by introducing new concepts alongside prior knowledge, real-world examples, or previous units of study. Our goal is to map new information onto our students’ activated schemas so that the new content or skills are contextualized and relevant. At this stage of instruction, we can prepare and provide a model of the task, using a Think Aloud mini-lesson where we walk our students through an internal thinking process that illustrates how we navigate the task and make decisions. Alternatively, we can outline the explicit steps to complete the task, or provide a roundup of the important information students need to know before diving in. The I do, you watch process can be presented to students working individually or in small groups. It’s important to remember that even at this stage, students shouldn’t be sitting silently. We always want students actively engaged, so we might add a note-taking component, a reflection task, a metacognitive class discussion, or an element of inquiry so that students remain intellectually engaged in the process.
Stage 2: I do, you help
After laying the groundwork for the task in stage 1, we can move into stage 2, where students begin working with the content and task materials with support. Working in small groups, students might replicate the model with new information, restate or reword the essential steps in their own words, or engage in a small group discussion or group practice as a way to begin experimenting with and internalizing the skills.
Stage 3: You do, I help
In stage 3, the content and skills should be familiar to students after their initial explorations, and they should be ready to continue in pairs or small groups with more independence. Students are still in the development phase of their learning, so they may need additional support and will benefit from frequent check-ins — but here’s where we want to avoid returning to stage 1 supports. We’re looking for students to be engaged in a productive struggle. Students may benefit from suggested “fix up” strategies or options for what to do if they get stuck. At this stage, we want to push students beyond replicating the model or the example by having them practice the skill or apply content in a new format or a new context, or by making connections to other topics within the discipline or beyond. This is also a great stage to ask students to use one another as resources. While working in pairs and small groups is an excellent way to support students at their level and create opportunities for growth through collaboration, we want to ensure a high level of individual accountability so that some students don’t take on the burden for the group while others opt out of the learning process.
Stage 4: You do, I watch
In stage 4, students have been exposed to new content and skills, they’ve practiced working on a task informally with support, and they’ve begun making connections with other content or demonstrating their learning through class activities and tasks. At this stage, it’s important to begin removing any unnecessary scaffolds to see what students can do independently. In the You do, I watch phase, we recommend providing a short review of the process and the work done up to this point in the learning experience. After the review, we can be clear with students that they’re ready to try it out on their own. Provide a clear task and an adequate amount of time to complete it (3-4 times as long as it would take you to do it). Students who are able to take on this challenge and demonstrate their skills individually show that they’re meeting the expectations of the task and are ready to move forward to the next knowledge block or skill set. Students who struggle at this stage help us to understand where and why they’re struggling, so that we can return to stage 3 to provide targeted support.
How long does this take?
Like an accordion, this process can be expanded or compressed to meet the needs of your grade level and subject area. We might be able to move through the four stages within a single lesson, or it may be an expanded process that is organized across a week’s worth of lesson plans. Consider these two examples:
45-minute Lesson Plan Structure
5 minutes | Opening warm up: Inquiry question
10 minutes | I do, you watch: Mini-lesson modeling
10 minutes | I do, you help: Stop and jot, turn and talk reflection on the model
15 minutes | You do, I help: Small group practice
5 minutes | You do, I watch: Closing summary formative assessment
Week-long Lesson Structure
Monday | I do, you watch: Introduction, modeling, and reflection
Tuesday | I do, you help: Small group discussion and practice
Wednesday | You do, I help: Small group practice and connections, part 1
Thursday | You do, I help: Small group practice and connections, part 2
Friday | You do, I watch: Independent practice and formative assessment
The process of instruction and assessment is complex, especially when we’re trying to use data to inform instruction and support students who’ve struggled in the past. We want to be mindful to keep forward momentum toward rigorous learning goals while developing a clear path for students at every starting level.
Principal Candace Hugee weighs in on the power of quantitative and qualitative data.
In my experiences as a classroom teacher, district level administrator, and as a professional development coach, I constantly struggle with the negative connotation often assigned to data. This is especially true in cases where educators see the term "data-driven instruction" as being synonymous with high-stakes testing. As my colleague G. Faith Little notes in Understanding Data: How Does It All Add Up?, data is not just a tool for evaluation — it’s a source of information.
The meaning of data
There are several major components of data-driven instruction. Understanding not only what they are, but what they mean is important when considering data points and the intended outcome of improving instruction.
A principal's perspective
with Candace Hugee

In Data-Driven Instruction, authors Ben Fenton and Mark Murphy note that “in this era of increased accountability, nearly every principal has begun using data to help drive instructional practices. Principals in the most rapidly improving schools almost always cite data-driven instruction as one of the most important practices contributing to their success. But what exactly does data-driven instruction mean to them, and how do they achieve it?” I decided to take that question and others to Candace Hugee, Principal at the Urban Assembly School for Collaborative Healthcare. We have been working together for nearly three years, and I have found her experiences and application of data at her school to be most instructive.
Current studies indicate that educators in schools with data-focused programs believe using data significantly improves their instruction. Very often, these schools gather many forms of data because they recognize that all forms of data are valuable information. The more information we have, the more informed our decisions can be, and the better our instruction will be for our students.
Observe, infer, and take action on a problem of practice using three simple prompts.
What are we noticing? So, what does it mean for teaching and learning? Now what should happen next? These are some of the questions posed by one of our favorite resources — What, So What, Now What — which leans on our core values of critical reflection and cycles of inquiry.
Developed by Gene Thompson-Grove in 2004 and revised in 2012, this protocol allows you to do several things at once: gather information, analyze and interpret a problem of practice, and envision next steps for your work. It’s a versatile protocol that can be modified to support teachers, leaders, and even students as they work to understand curricular content.
Jumpstart your reflection
What, So What, Now What can help you to evaluate a recent experience, untangle a problem of practice, or inspect quantitative or qualitative data. After observing and analyzing what you already know, you can then work toward identifying the next steps for your practice.
This resource works in three phases:
Understanding the event (What?)
Making sense of the facts and implications (So what?)
Identifying a course of action or new solutions (Now what?)
Engage students in inquiry
From here, the class can share highlights from their charts and begin to draw conclusions about the lesson.
What, So What, Now What is a highly adaptable tool that can promote curiosity, reflection, and accountability. Its flexibility allows for application with all members within a school community, and we encourage you to adapt it to best meet your needs.
How are you using this resource? Let us know in the comments!
Observing student behavior and communication to inform our instruction and create meaningful learning opportunities.
Productive struggle, a term that has gained popularity over the last decade, is found in instruction that “stretches students’ thinking and performance just beyond the level they can do on their own — the zone of proximal development.”
Finding and teaching to each of our students’ sweet spots is no easy feat. As teachers, we are often guilty of over-scaffolding, or rescuing our students out of fear or a desire to avoid student discomfort. Alternatively, we sometimes push students too far by introducing a task that is well beyond their level, without providing the necessary tools or assistance they need to meet the challenge. In order to support students in productive struggle, we need to be patient, persistent, and committed to gathering and using data in our instruction. This can include quantitative data such as reading levels and test scores, or, perhaps more importantly, student actions and behaviors — what we know as qualitative data. How can we gather qualitative data about our students in order to support them in finding a meaningful, productive level of struggle in their learning?
Identifying zones of struggle
Before we can locate promising practices for our instruction, we first need to identify look fors and listen fors that can help us determine if and when students are being pushed too far, or are not being pushed enough. When students are not being challenged enough or instruction is below their level, we consider this a level of no struggle. At the other end of the spectrum is destructive struggle, in which tasks are too challenging for particular students, or are significantly above their level. A zone of productive struggle lies between the two.
Let’s identify the look fors and listen fors for each of these zones:
When students are in a no struggle zone, it might look like and sound like:
When it comes to the destructive struggle zone, we might observe students who:
In contrast, when students are in a zone of productive struggle, we are likely to see them:
Observing student behavior online
Because so much instruction is happening remotely, and we may not be able to observe the same behaviors from our students in a remote setting, we can also identify look fors and listen fors for online teaching and learning. In a no struggle zone, we might observe online students who are:
When students are in a place of destructive struggle online, they might:
In contrast, productive struggle might look and sound like:
Determining students’ current zones of struggle is a helpful starting point for potential shifts in your instruction. Different students at different times might exhibit these behaviors, and this can inform your responses and course of action. By leaning on these look fors and listen fors, we can more effectively and confidently determine if, when, and how many scaffolds need to be introduced so that students are advancing their skills.
Preemptive planning
In addition to locating student behaviors and communication that can surface as they’re working on tasks, we can also lean on preemptive planning, which will support us in predicting and creating opportunities for productive struggle in our classrooms. Preemptive planning encourages thoughtful consideration and analysis of each task we’re offering students — whether it’s a culminating task of a unit or a task for a particular lesson. This process involves asking questions such as:
These questions can inform your instructional design as you work to meet the individual needs of students. Your assessment of students’ areas of struggle can inform where and how you’ll need to scaffold your instruction, and the areas of strength can inform where and how you introduce extensions or opportunities for deeper learning. Similar to identifying and responding to look fors and listen fors, preemptive planning is a meaningful form of data collection that can help you predict or anticipate student performance and ultimately use these predictions to inspire, inform, and cater your planning and instruction to your students.
There is no denying the challenges involved in meeting students where they are, especially when teaching large numbers of students — many of whom are on vastly different levels — or teaching at a distance. But by engaging in strategic planning based on data and evidence, we can create opportunities for our students to challenge themselves, recognize that they can do hard things, and make progress in their learning.
Engage in low-inference observations that can lead to new discoveries about your students' needs.
Behavior is a form of communication.
Plug this sentence into your preferred search engine, and it will return enough results to keep you reading for hours. Since it’s more likely you’re scrolling through this post on your train ride, between classes, or at lunch than sitting with a cup of tea and hours to spare, let’s connect the dots quickly and consider a way to treat behavior as communication: data we can analyze to uncover productive possibilities in our classrooms.
Collect the data: what is the behavior?
Start by documenting low-inference observations of behavior. As you jot down a description of the behavior, challenge yourself to write only what is observable. When you write, “a fight broke out,” ask yourself: what did I actually see? What did I hear? It’s worth the time it takes to develop your low-inference observation skills, because you will be working from more accurate data, as free of assumptions as possible.
Analyze the data: what might it mean?
You may already have classroom expectations clearly outlined, and students may be fully aware of the consequences of certain behaviors, whether a phone call home, a visit to the AP, or another intervention. Even so, consider using a tool to support your own problem-solving. This is especially helpful when you’re confronted with a persistent, or even new, behavior. Lifelines is a tool we’ve used with our partner schools when looking at data reports together. With a few customizations, we can use this tool to explore behavior as data.
Consider possibilities: apply analysis to inform instruction
What questions, lessons, or interventions make sense to support the student and their learning? Is an individual moment needed, or could the whole class benefit from some time investigating this issue together? Try your lesson or intervention out. What happened? What other questions came up? What might you try next? If you remain curious, your “final” determination in the Lifelines tool can be a starting point for a simple cycle of inquiry.
By staying curious and making low-inference observations of student behavior, you can engage in an inquiry cycle that could lead to new and exciting discoveries about what your students’ behavior is communicating. Your findings can then support students academically by addressing their social-emotional needs.