The Measurement Revolution
Today's growth measurement relies primarily on periodic standardized assessments. Tomorrow's may capture learning continuously through everyday interactions, use AI to provide personalized insights, and expand beyond academic skills to measure competencies like collaboration and creativity.
The annual state test comes back in summer, months after the school year has ended and students have moved on to new grades. Interim assessments are administered three times a year, providing snapshots separated by long intervals. Even these relatively "frequent" measures leave vast gaps in our understanding of how learning unfolds day by day and week by week.
What if growth measurement could be continuous? What if every interaction with learning software generated data about skill development? What if AI could analyze learning patterns in real time and identify students who need support before they fall behind?
These questions drive innovation in educational measurement. The next decade will likely bring fundamental changes in how we track, understand, and respond to student growth. Here's what's emerging and what it might mean for educators.
Continuous Assessment and Learning Analytics
The most significant shift may be from periodic to continuous assessment. Instead of formal testing events separated by months, growth data could flow from everyday learning activities.
Embedded Assessment
Digital learning platforms increasingly embed assessment within instructional activities. As students complete problems, write responses, or interact with content, the system captures performance data. Over time, patterns emerge that reveal skill development without requiring separate testing events.
This approach offers several advantages: more frequent measurement, reduced testing burden, and performance captured in authentic learning contexts rather than artificial test conditions. The challenge is ensuring that embedded assessments validly measure the constructs we care about.
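To make the idea concrete, here is a minimal sketch of how an embedded assessment pipeline might record interactions and roll them up into per-skill estimates. The event fields, skill labels, and window size are illustrative assumptions, not the data model of any particular platform.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class InteractionEvent:
    """One student interaction captured during normal instruction."""
    student_id: str
    skill: str          # e.g., "fractions" or "main-idea" (placeholder labels)
    correct: bool
    timestamp: float    # Unix seconds

def rolling_accuracy(events, window=20):
    """Estimate per-skill proficiency as accuracy over the most recent attempts."""
    by_skill = defaultdict(list)
    for e in sorted(events, key=lambda e: e.timestamp):
        by_skill[e.skill].append(e.correct)
    return {
        skill: sum(attempts[-window:]) / len(attempts[-window:])
        for skill, attempts in by_skill.items()
    }
```

A rolling window like this is deliberately simple; the point is that proficiency estimates can accumulate from ordinary practice without a separate testing event.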
Learning Process Data
Beyond just right or wrong answers, digital systems can capture how students arrive at answers. How long do they spend on different problem types? When do they pause or backtrack? What patterns appear in their errors? This "process data" reveals dimensions of learning invisible in outcome-only measures.
A student who gets problems right quickly demonstrates different mastery than one who gets them right after extended struggle. Process data captures these differences, enabling more nuanced understanding of growth.
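As a simple illustration of combining outcome and process data, the sketch below labels a response using correctness plus response time. The 30-second fluency threshold is a hypothetical placeholder; a real system would calibrate thresholds per item type and per student.

```python
def classify_response(correct: bool, seconds: float, fluent_threshold: float = 30.0) -> str:
    """Label a response using both the outcome and how it was produced.

    The threshold is an illustrative assumption, not a validated cut point.
    """
    if correct and seconds <= fluent_threshold:
        return "fluent mastery"
    if correct:
        return "effortful success"
    if seconds <= fluent_threshold:
        return "quick error (possible misconception or guess)"
    return "extended struggle"
```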
Real-Time Dashboards
As assessment becomes continuous, visualization must keep pace. Real-time dashboards can show teachers which students are struggling right now, today, not weeks later when test results return. This enables immediate response—adjusting instruction, providing support, intervening before small struggles become significant gaps.
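One way a dashboard might surface "struggling right now" is to compare each student's recent accuracy against their own longer-run accuracy. The sketch below is a hedged illustration; the window size and drop threshold are invented.

```python
def students_needing_support(history, recent_n=10, drop=0.20):
    """Flag students whose recent accuracy has fallen well below their overall accuracy.

    `history` maps student id to a chronological list of correct/incorrect results.
    Thresholds are illustrative placeholders.
    """
    flagged = []
    for student, attempts in history.items():
        if len(attempts) < 2 * recent_n:
            continue  # not enough data for a stable comparison
        recent = sum(attempts[-recent_n:]) / recent_n
        overall = sum(attempts) / len(attempts)
        if overall - recent >= drop:
            flagged.append(student)
    return flagged

print(students_needing_support({"s1": [True] * 15 + [False] * 10, "s2": [True] * 25}))  # ['s1']
```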
Periodic vs. Continuous Assessment
| Periodic (Current Model) | Continuous (Emerging Model) |
| --- | --- |
| 3-4 measurement points per year | Daily data from learning activities |
| Formal testing events | Embedded in instruction |
| Results available days to weeks later | Real-time insights |
| Snapshot of performance at testing moment | Trajectory visible across all activities |
| Separate from instruction | Integrated with learning |
AI and Machine Learning in Growth Measurement
Artificial intelligence is transforming what's possible in educational measurement. Several applications are already emerging or on the near horizon:
Adaptive Assessment Evolution
Current adaptive tests like MAP adjust difficulty based on right/wrong responses. Next-generation adaptive systems may use AI to select questions based on much richer data—response patterns, time data, learning history—providing more precise measurement with fewer questions.
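For context, today's adaptive selection can be sketched in a few lines: under a Rasch (one-parameter logistic) model, the most informative next item is the one whose difficulty sits closest to the student's current ability estimate. Next-generation systems might replace this single criterion with models that also weigh response time and learning history. The item bank and ability value below are illustrative.

```python
import math

def p_correct(theta: float, difficulty: float) -> float:
    """Rasch model: probability of a correct response at ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def item_information(theta: float, difficulty: float) -> float:
    """Fisher information of an item; largest when difficulty matches ability."""
    p = p_correct(theta, difficulty)
    return p * (1.0 - p)

def next_item(theta: float, item_bank: dict) -> str:
    """Choose the item with the highest information at the current ability estimate."""
    return max(item_bank, key=lambda item_id: item_information(theta, item_bank[item_id]))

# Hypothetical item bank: item id -> difficulty
bank = {"item_a": -1.0, "item_b": 0.5, "item_c": 1.8}
print(next_item(0.4, bank))  # "item_b" is closest in difficulty, hence most informative
```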
Automated Scoring of Complex Work
AI can now score essays and open-ended responses with reliability approaching that of human raters. This enables growth measurement in domains that resist multiple-choice testing—writing quality, scientific reasoning, mathematical explanation. As AI scoring improves, growth measurement can expand into more authentic and complex demonstrations of learning.
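Before relying on AI-assigned scores, a district would typically check agreement with human raters; quadratic weighted kappa is one common statistic for that check. A small sketch assuming scikit-learn is available and scores sit on an integer rubric scale; the scores shown are made up.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical rubric scores (0-4) for the same essays from a human rater and an AI scorer
human_scores = [3, 2, 4, 1, 3, 2, 0, 4, 3, 2]
ai_scores    = [3, 2, 3, 1, 3, 3, 1, 4, 3, 2]

# Quadratic weighting penalizes large disagreements more than adjacent ones
qwk = cohen_kappa_score(human_scores, ai_scores, weights="quadratic")
print(f"Quadratic weighted kappa: {qwk:.2f}")  # values above roughly 0.7 are often treated as acceptable
```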
Predictive Analytics
Machine learning models can analyze learning data to predict future outcomes with increasing accuracy. Beyond simple trajectory projection, AI can identify complex patterns that predict risk—combinations of factors that human analysis would miss. This enables earlier, more precise intervention targeting.
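A toy sketch of the idea using logistic regression in scikit-learn. The features, training data, and interpretation are invented for illustration; any real model would need rigorous validation and the bias auditing discussed later in this piece.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per student: [attendance rate, prior growth percentile, recent accuracy]
X_train = np.array([
    [0.95, 60, 0.80],
    [0.70, 25, 0.55],
    [0.88, 45, 0.72],
    [0.60, 15, 0.40],
    [0.92, 70, 0.85],
    [0.75, 30, 0.50],
])
# 1 = did not meet the next benchmark (at risk), 0 = met it
y_train = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

# Estimated risk for a new student; treat as a prompt for human review, not an automatic decision
new_student = np.array([[0.82, 35, 0.60]])
risk = model.predict_proba(new_student)[0, 1]
print(f"Estimated risk: {risk:.0%}")
```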
Personalized Growth Insights
AI can generate individualized interpretations of growth data—not just "your score is 205" but "you've shown strong growth in literary analysis but slower growth in writing mechanics; here are specific areas to focus on." These personalized insights make growth data more actionable for students, teachers, and families.
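Even simple template-based generation can turn per-domain growth numbers into readable guidance. In the sketch below, the growth ratios, thresholds, and domain names are hypothetical.

```python
def growth_insight(student_name, domain_growth, strong=1.2, slow=0.8):
    """Turn per-domain growth ratios (observed / expected growth) into a plain-language note.

    Thresholds and domain names are illustrative placeholders, not a published standard.
    """
    strengths = [d for d, g in domain_growth.items() if g >= strong]
    focus = [d for d, g in domain_growth.items() if g <= slow]
    sentences = []
    if strengths:
        sentences.append(f"{student_name} has shown strong growth in {', '.join(strengths)}.")
    if focus:
        sentences.append(f"Growth has been slower in {', '.join(focus)}; these are good areas to focus on.")
    if not sentences:
        sentences.append(f"{student_name} is growing about as expected across all domains.")
    return " ".join(sentences)

print(growth_insight("Jordan", {"literary analysis": 1.4, "writing mechanics": 0.6}))
```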
Expanding What We Measure
Current growth measurement focuses almost exclusively on academic skills—reading, mathematics, occasionally science and writing. The future may bring valid, reliable measurement of broader competencies:
Social-Emotional Learning
SEL skills like self-regulation, growth mindset, and social awareness are increasingly recognized as important for success. Measuring growth in these areas is challenging—self-report surveys have limitations, and direct observation is expensive. Emerging approaches include analyzing digital interactions, capturing behavioral data, and developing more reliable survey instruments.
21st Century Skills
Collaboration, creativity, critical thinking, communication—these skills are valued by employers and essential for citizenship, yet rarely measured in school accountability. New assessment approaches attempt to capture these competencies through collaborative digital tasks, open-ended problem solving, and portfolio-based evidence.
Physical and Artistic Development
Technology enables measurement of growth in domains previously difficult to assess at scale—motor skill development through movement tracking, artistic skill through digital portfolio analysis, musical growth through audio analysis. The narrow focus on academic subjects may broaden as measurement capabilities expand.
Privacy and Ethical Considerations
As growth measurement becomes more pervasive and AI-powered, significant ethical questions arise:
Data Privacy
Continuous assessment generates vast amounts of student data. Who owns this data? How long is it retained? Who can access it? Current privacy frameworks may be inadequate for the data-rich environments emerging in education.
Algorithmic Bias
AI systems can perpetuate or amplify existing biases. If predictive algorithms are trained on historical data that reflects systemic inequities, their predictions may disadvantage already-marginalized students. Ensuring fairness in AI-powered assessment requires ongoing attention to bias detection and mitigation.
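One basic audit is to compare a model's error rates across student groups, for example the share of truly at-risk students it misses in each group. A sketch with invented predictions and placeholder group labels:

```python
from collections import defaultdict

def false_negative_rates(y_true, y_pred, groups):
    """For each group, compute the share of truly at-risk students the model missed.

    Large gaps between groups are a signal to investigate training data and features.
    """
    missed = defaultdict(int)
    at_risk = defaultdict(int)
    for actual, predicted, group in zip(y_true, y_pred, groups):
        if actual == 1:
            at_risk[group] += 1
            if predicted == 0:
                missed[group] += 1
    return {g: missed[g] / at_risk[g] for g in at_risk if at_risk[g] > 0}

# Hypothetical data: 1 = at risk, group labels are placeholders
print(false_negative_rates(
    y_true=[1, 1, 0, 1, 1, 0, 1, 0],
    y_pred=[1, 0, 0, 1, 0, 0, 0, 0],
    groups=["A", "A", "A", "B", "B", "B", "B", "A"],
))
```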
The Risk of Determinism
If AI predicts that a student is unlikely to succeed, how should that prediction be used? Predictive analytics intended to help students could instead limit opportunities—tracking students into less rigorous courses or lowering expectations. The technology must serve student growth, not constrain it.
The Surveillance Question
Continuous monitoring of student learning raises surveillance concerns. How much monitoring is appropriate? When does helpful data collection become intrusive surveillance? Students and families should have meaningful input into these questions.
Competency-Based Approaches
Competency-based education (CBE) models are reshaping how we think about growth and progression. Instead of time-based advancement (a year of instruction = a year of growth), CBE focuses on demonstrated mastery:
Growth as Mastery Accumulation
In competency-based systems, growth is measured by the competencies students master, not by time spent. A student might master some skills quickly and others slowly; growth tracking captures their progression through the competency map regardless of calendar time.
Personalized Learning Pathways
CBE enables different students to follow different paths through content based on their readiness and interests. Growth measurement must be flexible enough to capture progress along varied pathways, not just movement along a single predetermined sequence.
Implications for Growth Metrics
Traditional growth metrics assume all students are following the same curriculum at the same pace. Competency-based systems require new approaches—perhaps measuring the rate of competency mastery, the breadth of competencies demonstrated, or progress toward personalized goals.
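One possible metric is the rate of competency mastery: competencies demonstrated per month of enrollment, independent of the pathway taken. The sketch below assumes a simple hypothetical data model mapping competency ids to mastery dates; the ids shown are placeholders.

```python
from datetime import date

def mastery_rate(mastery_dates, enrollment_start, as_of):
    """Competencies mastered per month of enrollment, regardless of pathway.

    `mastery_dates` maps competency id -> date mastered (hypothetical data model).
    """
    months_enrolled = max((as_of - enrollment_start).days / 30.44, 1.0)  # floor at one month
    mastered = sum(1 for d in mastery_dates.values() if d <= as_of)
    return mastered / months_enrolled

rate = mastery_rate(
    mastery_dates={"MATH.NBT.1": date(2024, 10, 2), "MATH.NBT.2": date(2024, 11, 15)},
    enrollment_start=date(2024, 9, 3),
    as_of=date(2025, 1, 15),
)
print(f"{rate:.2f} competencies mastered per month")
```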
Implementation Challenges
The technologies and approaches described are promising but face significant implementation challenges:
Equity of Access
Technology-enabled assessment requires technology. Schools serving low-income students may lack devices, connectivity, or technical support. New approaches must avoid widening opportunity gaps.
Validation Requirements
Novel assessment approaches must demonstrate validity—that they actually measure what they claim to measure. This requires rigorous research that takes time and resources.
Educator Capacity
New measurement tools are only valuable if educators know how to use them. Training and support must accompany technological deployment.
System Integration
Data from new assessment approaches must integrate with existing systems—student information systems, reporting platforms, accountability frameworks. Fragmented data limits usefulness.
What Won't Change
Amid all the technological change, certain fundamentals will remain constant:
The purpose is still learning. Technology is a tool, not an end. Growth measurement matters only insofar as it helps educators help students learn. Technology that generates data without improving instruction is not progress.
Human judgment remains essential. AI can process data, identify patterns, and generate predictions. But decisions about students should remain human decisions, informed by data but not determined by algorithms.
Relationships matter more than data. The most sophisticated growth measurement system is worthless without educators who know students, care about their success, and respond when data indicates need. Data informs; humans help.
Equity requires intentional focus. New technologies can either reduce or amplify existing inequities. Ensuring that innovations serve all students—not just those in well-resourced schools—requires deliberate attention and design.
Preparing for Change
Schools and districts can prepare for evolving growth measurement by:
Building data literacy. As data becomes richer and more continuous, educators need skills to interpret and act on it. Investing in data literacy now prepares staff for future capabilities.
Establishing ethical frameworks. Develop policies for data privacy, algorithmic fairness, and appropriate use of predictive analytics before technologies force the issue. Proactive ethics is better than reactive scrambling.
Piloting thoughtfully. New assessment approaches should be piloted and evaluated before full deployment. Learn what works in your context before scaling.
Centering student benefit. Every adoption decision should answer: How does this help students learn? Technology for technology's sake doesn't serve educational purposes.
The future of growth measurement is exciting and uncertain. What's certain is that how we measure student learning will continue to evolve—and that keeping the focus on actual learning, not just measurement, will remain the essential task.
Key Takeaways
- Assessment is shifting from periodic testing events to continuous measurement embedded in daily learning activities.
- AI enables automated scoring of complex work, personalized insights, and predictive analytics that identify at-risk students earlier.
- Ethical considerations—privacy, algorithmic bias, surveillance—must be addressed proactively as measurement technology advances.
- Human judgment and relationships remain essential regardless of technological advances—data informs, but humans help.
Marcus Johnson
Director of Data Science
Data scientist specializing in educational analytics with expertise in growth modeling and predictive analytics for student outcomes.



