The Professional Learning Opportunity
When growth data is used for learning rather than judgment, teachers engage with it differently. They ask questions, examine patterns, and experiment with new approaches. The same data that feels threatening in an evaluation context becomes powerful fuel for professional growth.
The math department meeting started with a data review. Student growth results were displayed—some teachers' classes showing strong growth, others less so. The principal pointed to the variation: "We need to figure out what's driving these differences."
Immediately, the room's energy shifted. Teachers who showed lower growth became defensive. Those with higher growth stayed quiet, not wanting to seem like they were gloating. The conversation that followed was stilted, unproductive, and left everyone feeling worse.
Contrast this with another school's approach. The department head shared the same type of data but framed it differently: "We're all curious about what drives growth in math. Let's look at our collective results as a puzzle to solve together." Teachers with high-growth classes shared specific practices. Teachers with lower growth asked genuine questions. The conversation generated ideas that everyone could try.
Same data, completely different outcomes. The difference lay in how the data was framed and used.
The Case for Growth Data in Professional Development
Growth data provides something that most professional development lacks: feedback on whether instructional changes actually improve student learning. Without outcome data, teachers can implement new strategies without knowing if they help.
Consider the traditional professional development cycle: teachers attend a workshop, learn a new technique, try it in their classrooms, and... then what? Without growth data, teachers rely on subjective impressions of whether the new approach is working. These impressions may be inaccurate.
Growth data closes the feedback loop. Teachers can see whether students are actually learning more when they implement new practices. This transforms professional development from "try this technique" to "try this technique and let's see if it produces results."
Creating Safety for Data Use
The first math department meeting failed because teachers didn't feel safe examining data that might reflect poorly on them. Creating psychological safety is a prerequisite for productive data use:
Separate Learning from Evaluation
When growth data is used for high-stakes evaluation, teachers understandably become defensive. For professional learning purposes, create spaces where data examination is explicitly divorced from evaluation. "This conversation is about learning, not rating."
Start with Collective, Not Individual
Beginning with school-wide or grade-level patterns is less threatening than starting with individual teacher data. "As a school, we're seeing stronger growth in reading than math—let's understand why" invites collective inquiry rather than individual defensiveness.
Normalize Variation
Variation in growth results is normal and expected. Some variation reflects random fluctuation, not meaningful differences in teaching quality. Acknowledging this helps teachers engage with data without feeling that every difference is a judgment on their competence.
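To see how much variation chance alone can produce, consider a small simulation. Every number in it is an illustrative assumption (class size, average growth, student-level spread), and every simulated class receives identical instruction, yet the class averages still differ noticeably:

```python
# A minimal simulation with illustrative assumptions: 20 classes of 25
# students, identical instruction everywhere, average true growth of
# 5 RIT points, student-level standard deviation of 8 points.
import numpy as np

rng = np.random.default_rng(seed=42)

n_classes, class_size = 20, 25
true_mean_growth, student_sd = 5.0, 8.0

# Draw every class from the same distribution, so any differences in
# class averages below are pure sampling noise, not teaching quality.
class_means = rng.normal(true_mean_growth, student_sd,
                         size=(n_classes, class_size)).mean(axis=1)

print(f"Identical instruction, yet class averages span "
      f"{class_means.min():.1f} to {class_means.max():.1f} RIT points.")
```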
Focus on Improvement, Not Blame
The question should be "What can we learn from this data to improve?" not "Who's responsible for these results?" Blame-focused conversations shut down learning; improvement-focused conversations open it up.
Safety Indicators in Data Conversations
Signs of Safety
- Teachers ask genuine questions
- People share struggles openly
- Low-growth teachers participate actively
- Conversation generates ideas to try
- Teachers reference data in later discussions
Signs of Threat
- Silence or defensive responses
- Blame-shifting to students or circumstances
- Questioning the validity of data
- Conversation ends quickly
- Teachers avoid future data discussions
Productive Data Inquiry Protocols
Structure supports productive data conversations. Without protocols, discussions can devolve into defensiveness or superficiality. Several protocols have proven effective:
The Data-Driven Dialogue Protocol
This protocol structures data examination through four phases:
Phase 1: Predictions
Before seeing data, participants predict what they expect to see. This surfaces assumptions and creates engagement with upcoming data.
Phase 2: Observations
Participants share what they notice in the data—just observations, no interpretations. "I notice 7th grade shows lower growth than 6th or 8th." This builds a shared picture of what the data shows.
Phase 3: Inferences
Now participants offer possible explanations. "Maybe the 7th-grade curriculum transition is challenging." Multiple explanations are encouraged; premature conclusions are avoided.
Phase 4: Implications
What actions does this suggest? "We should look more closely at the 7th-grade transition and what supports students might need." The goal is actionable next steps.
Positive Deviance Inquiry
This approach focuses specifically on identifying what's working rather than what's failing. When some teachers or classrooms show notably strong growth, the inquiry explores what they're doing that might be replicated:
"Ms. Chen's math classes consistently show growth at the 75th percentile or above. What's happening in her classroom that we can learn from?"
This approach avoids the defensiveness triggered by focusing on low performance while still using data to drive improvement. It's harder to feel threatened when the focus is on celebrating and learning from success.
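For teams that keep growth results in a spreadsheet or data system, the sketch below shows one way this scan might look. The column names and the 75th-percentile threshold are hypothetical; the point is simply to surface classrooms with consistently strong results as invitations to share, not as rankings:

```python
# A sketch with hypothetical column names and a hypothetical threshold:
# flag teachers whose median student growth percentile stays at or above
# the 75th in every term, as candidates to learn from.
import pandas as pd

# Assumed layout: one row per student per term.
df = pd.DataFrame({
    "teacher": ["Chen", "Chen", "Ortiz", "Ortiz", "Lee", "Lee"],
    "term": ["F23", "S24", "F23", "S24", "F23", "S24"],
    "growth_percentile": [81, 78, 55, 60, 74, 69],
})

median_by_term = df.groupby(["teacher", "term"])["growth_percentile"].median()
consistent_high = median_by_term.groupby(level="teacher").min() >= 75

print("Candidates for positive-deviance inquiry:")
print(consistent_high[consistent_high].index.tolist())  # ['Chen']
```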
Personal Data Reflection
Before group discussion, individual teachers examine their own data privately using guiding questions:
- What patterns do I notice in my growth results across different groups of students?
- Where did students grow more than I expected? Less than expected?
- What instructional choices might have influenced these results?
- What questions does this data raise that I want to explore?
Private reflection allows teachers to process reactions before public discussion, leading to more thoughtful group conversations.
Connecting Data to Practice
Data alone doesn't improve instruction—changes in practice do. Effective professional development using growth data connects patterns in results to specific instructional approaches:
Practice Sharing
When teachers identify classroom practices that seem connected to strong growth, create opportunities to share specifics. Not general principles ("I differentiate instruction") but concrete details ("Here's my small-group rotation schedule and how I use assessment data to form groups").
Classroom Observations
Arrange for teachers to observe colleagues whose students show strong growth. The combination of data (showing results) and observation (showing practice) creates powerful learning opportunities. Observers can see specific practices in action, then track whether implementing those practices improves their own students' growth.
Experimentation and Follow-Up
Professional development should lead to experiments: teachers trying new practices and monitoring results. Follow-up sessions examine growth data to see whether instructional changes produced learning gains. This creates genuine cycles of inquiry.
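As a rough illustration of that follow-up, a teacher or team might compare a class's growth distribution before and after a practice change. The numbers below are fabricated, and because the two years involve different students the comparison is only suggestive, but it captures the basic move: check the outcome data, not just the implementation.

```python
# A rough sketch with fabricated numbers: compare class growth before and
# after adopting a new practice. Different cohorts of students make this
# only suggestive, so treat the test as a sanity check, not proof.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)
growth_before = rng.normal(4.0, 8.0, size=26)  # RIT growth, prior year
growth_after = rng.normal(6.5, 8.0, size=24)   # year with new practice

t_stat, p_value = stats.ttest_ind(growth_after, growth_before,
                                  equal_var=False)
print(f"Mean growth: {growth_before.mean():.1f} -> {growth_after.mean():.1f}")
print(f"Welch t-test p = {p_value:.2f} (one year is rarely conclusive)")
```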
The Problem with High-Stakes Use
A word of caution: using growth data for high-stakes teacher evaluation—merit pay, retention decisions, public rankings—tends to undermine its value for professional learning. Research and practical experience suggest several problems:
Defensiveness replaces curiosity. When results affect employment and compensation, teachers have every reason to defend their data rather than learn from it. The psychological safety needed for genuine inquiry evaporates.
Gaming emerges. High-stakes measures invite manipulation. Teachers may focus narrowly on tested content, teach test-taking strategies, or seek students likely to show growth. These responses undermine the validity of the data.
Collaboration decreases. When teachers are ranked against each other, sharing practices that produce strong results becomes competitively disadvantageous. The collective learning that improves schools gives way to individual positioning.
Statistical issues are amplified. Growth measures for individual teachers have significant measurement error. Results can vary substantially from year to year due to factors unrelated to teaching quality. Using volatile measures for high-stakes decisions produces unfair and unreliable outcomes.
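A small simulation makes this volatility concrete. Under the assumed (hypothetical) effect sizes below, a stable teacher signal plus year-specific noise twice its size, the correlation between one year's measured results and the next drops to about 0.2:

```python
# A simulation with assumed effect sizes: each teacher has a stable "true
# effect" on growth (SD 1), but the year-specific noise is twice as large
# (SD 2). The true year-to-year correlation is then 1 / (1 + 2**2) = 0.2.
import numpy as np

rng = np.random.default_rng(seed=1)
n_teachers = 200

true_effect = rng.normal(0.0, 1.0, n_teachers)           # stable signal
year1 = true_effect + rng.normal(0.0, 2.0, n_teachers)   # noisy measure
year2 = true_effect + rng.normal(0.0, 2.0, n_teachers)   # same teachers

r = np.corrcoef(year1, year2)[0, 1]
print(f"Year-to-year correlation of measured growth: r = {r:.2f}")
# Most of a teacher's movement in the rankings between years is
# measurement noise, not a change in teaching.
```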
Many assessment organizations, including NWEA, explicitly recommend against using their growth data for high-stakes teacher evaluation. The same data used productively for professional learning becomes counterproductive when attached to high stakes.
Building Data Literacy
Productive data use requires data literacy—understanding what measures mean, their limitations, and appropriate interpretations. Professional development should build this literacy:
Understanding growth metrics. Teachers should understand the difference between achievement and growth, what percentiles and projections mean, and how conditional growth percentiles work (a toy version appears in the sketch at the end of this list). Misinterpretation undermines useful application.
Recognizing measurement limitations. All assessments have error. Small differences may not be meaningful. Single-year results can fluctuate. Teachers who understand these limitations avoid over-interpreting data while still learning from it.
Connecting data to context. Growth results are influenced by many factors beyond instruction—student mobility, attendance, life circumstances, prior educational experience. Skilled data users consider context when interpreting results.
Asking productive questions. Data literacy includes knowing what questions to ask: What patterns appear across student groups? How does this year compare to last year? What does this suggest about instruction? What additional information would help understand this pattern?
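The sketch below ties a few of these literacy moves together. The column names are hypothetical, and the conditional percentile shown is a deliberately simplified toy rather than any assessment vendor's actual formula:

```python
# A sketch with hypothetical column names. The "conditional" percentile
# here is a toy version (rank growth within quartile bands of starting
# score), not any vendor's actual methodology.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=3)
df = pd.DataFrame({
    "start_rit": rng.normal(210, 15, 300).round(),
    "growth": rng.normal(5, 8, 300).round(1),
    "group": rng.choice(["ELL", "non-ELL"], 300),
})

# Question 1: what patterns appear across student groups?
print(df.groupby("group")["growth"].agg(["mean", "count"]).round(1))

# Question 2: how did each student grow relative to similar-starting
# peers? Band by starting score, then rank growth within each band.
df["band"] = pd.qcut(df["start_rit"], q=4, labels=False)
df["cond_pct"] = (df.groupby("band")["growth"]
                    .rank(pct=True).mul(100).round())
print(df[["start_rit", "growth", "cond_pct"]].head())
```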
Leadership's Role
School leaders shape whether data use is productive or counterproductive:
Model learning orientation. Leaders who treat their own data with curiosity and openness set the tone. "Our school's growth was lower than I hoped. I want to understand why and what I can do differently as a leader."
Protect psychological safety. Leaders must actively create and maintain safety for honest data discussion. This means intervening when conversations become blaming, celebrating risk-taking, and ensuring that candid engagement with data is rewarded rather than punished.
Focus on learning, not ranking. Avoid using data to compare and rank teachers publicly. Celebrate strong results without creating competitive dynamics. Focus discussions on "what can we learn?" not "who's best and worst?"
Ensure follow-through. Data discussions should lead to action—and action should be monitored and supported. Leaders ensure that insights from data analysis translate into actual changes in practice and that teachers receive support in implementing changes.
The Payoff
When growth data is used well for professional development, the results are powerful. Teachers engage with evidence about their practice in ways that drive real improvement. Successful practices spread through teams and schools. Professional learning becomes grounded in outcomes rather than activities.
Return to the math department that learned to examine data productively. Over two years, their collective growth percentiles improved significantly. More importantly, teachers reported feeling more professional—treating their practice as something to be studied and refined rather than simply delivered.
"I never thought I'd say this about data," one teacher reflected, "but our growth conversations are the most valuable PD we have. We actually learn things that change what we do in our classrooms."
That's the potential of growth data in professional development: not numbers for judgment, but evidence for learning.
Key Takeaways
- Creating psychological safety is a prerequisite for productive data use—separate learning conversations from evaluation.
- Structured protocols help data conversations stay productive rather than devolving into defensiveness.
- High-stakes use of growth data for evaluation tends to undermine its value for professional learning.
- Leaders shape data culture by modeling curiosity, protecting safety, and ensuring discussions lead to action.
James Okonkwo
Senior Implementation Specialist
Former charter school administrator with deep expertise in Michigan charter school accountability and authorizer relations.



