Measurement and Evaluation Plan
This document defines the comprehensive evaluation framework for the El Segundo USD AI Literacy and Workforce Development Project. The framework establishes baseline measurements, tracks progress through interim milestones, and validates outcomes against program objectives.
Evaluation Framework Overview
Evaluation Purpose
| Purpose | Description | Stakeholder |
|---|---|---|
| Accountability | Demonstrate responsible use of resources and achievement of stated goals | School Board; Funders |
| Improvement | Identify what works and what needs adjustment for continuous enhancement | Program Team; Teachers |
| Learning | Generate insights transferable to other contexts and future initiatives | Education Community; Researchers |
| Validation | Confirm program theory and expected causal relationships | Leadership; Partners |
Evaluation Approach
Mixed-Methods Design:
- Quantitative metrics for measurable outcomes
- Qualitative data for context and depth
- Formative assessment for ongoing improvement
- Summative evaluation for outcome determination
Evaluation Timeline:
| Phase | Timing | Focus |
|---|---|---|
| Baseline | Month 1 | Current state documentation |
| Formative | Months 2-11 | Ongoing monitoring and adjustment |
| Interim Summative | Month 12 | Year 1 outcome assessment |
| Final Summative | Month 18 | Full program evaluation |
Baseline Metrics (Current State)
Teacher Baseline Assessment
Survey Instrument: Teacher AI Readiness Assessment
| Dimension | Survey Items | Scale | Target Sample |
|---|---|---|---|
| AI Knowledge | Self-assessed understanding of AI concepts and applications | 1-5 Likert | All teachers (350+) |
| AI Tool Experience | Prior use of AI tools (personal and professional) | Frequency scale | All teachers |
| AI Confidence | Comfort level teaching with and about AI | 1-5 Likert | All teachers |
| AI Attitude | Beliefs about AI in education (benefits, risks, appropriateness) | 1-5 Likert | All teachers |
| Training Needs | Self-identified areas for professional development | Ranking | All teachers |
Baseline Data Collection:
| Metric | Measurement Method | Timing | Owner |
|---|---|---|---|
| Teacher AI literacy rate | Knowledge assessment (20-item quiz) | Month 1 | SA |
| Teacher AI tool usage | Self-report survey | Month 1 | SA |
| Teacher AI confidence | 5-point scale survey | Month 1 | SA |
| Existing AI integration | Lesson plan review; Observation sample | Month 1-2 | Skafld |
| Professional development history | HR records review | Month 1 | ESUSD |
Student Baseline Assessment
Survey Instrument: Student AI Experience Survey (Grade-Appropriate Versions)
| Grade Band | Assessment Components | Administration |
|---|---|---|
| K-2 | Observational checklist; Picture-based recognition | Teacher-administered |
| 3-5 | AI recognition quiz; Self-report experience survey | Classroom administration |
| 6-8 | Knowledge assessment; Attitude survey; Tool experience inventory | Online administration |
| 9-12 | Comprehensive assessment; Career readiness indicators; Attitude survey | Online administration |
Baseline Data Collection:
| Metric | Measurement Method | Timing | Owner |
|---|---|---|---|
| Student AI awareness | Grade-appropriate assessment | Month 2 | SA |
| Student AI tool usage | Self-report (6-12) | Month 2 | SA |
| Student AI confidence | 5-point scale (3-12) | Month 2 | SA |
| Student AI career interest | Career survey (6-12) | Month 2 | SA |
| Gender gap baseline | Disaggregated analysis of above | Month 2 | SA |
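The gender-gap baseline above is a disaggregated comparison of the same survey metrics. A minimal sketch of that analysis, using hypothetical records and field names (the real data would come from the district's survey tooling) and expressing the gap as a percentage of the overall mean:

```python
from statistics import mean

# Hypothetical baseline records: (student_id, gender, ai_confidence 1-5).
# IDs, values, and the gender coding are illustrative only.
responses = [
    ("s01", "F", 3.0), ("s02", "M", 4.0), ("s03", "F", 2.5),
    ("s04", "M", 3.5), ("s05", "F", 3.5), ("s06", "M", 4.5),
]

def disaggregate(rows, group_index=1, value_index=2):
    """Group survey rows by a demographic field and return mean score per group."""
    groups = {}
    for row in rows:
        groups.setdefault(row[group_index], []).append(row[value_index])
    return {g: mean(vals) for g, vals in groups.items()}

means = disaggregate(responses)
overall = mean(r[2] for r in responses)
# Gap as a percentage of the overall mean (one plausible reading of the
# "gap < 10%" parity targets later in this plan).
gap_pct = abs(means["F"] - means["M"]) / overall * 100
print(means, round(gap_pct, 1))
```

The same `disaggregate` call works for any metric in the table; only the value column changes.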
Administrator Baseline Assessment
| Metric | Measurement Method | Timing | Owner |
|---|---|---|---|
| Strategic AI understanding | Pre-workshop assessment | Month 1 | Skafld |
| Policy readiness | Gap analysis of current policies | Month 1 | SA |
| Resource allocation awareness | Budget review | Month 1 | ESUSD |
System Baseline Assessment
| Metric | Measurement Method | Timing | Owner |
|---|---|---|---|
| AI tools currently approved | Technology inventory | Month 1 | ESUSD IT |
| Infrastructure capacity | Technical assessment | Month 1 | ESUSD IT |
| Existing curriculum AI content | Curriculum audit | Month 1-2 | Skafld |
| Community sentiment | Parent survey sample | Month 1 | ESUSD |
Interim Milestones
Teacher Adoption Metrics
Cohort 1 Champions (Months 2-6):
| Milestone | Target | Measurement | Timing |
|---|---|---|---|
| Recruitment completion | 20 teachers enrolled | Enrollment records | Month 2 |
| Training attendance | 90% attendance rate | Attendance tracking | Months 2-4 |
| Competency assessment | 90% pass rate | Post-training assessment | Month 4 |
| Lesson plan completion | 5 plans per teacher (100 total) | Plan submission review | Month 5 |
| Classroom implementation | 80% implementing weekly | Self-report + observation | Month 6 |
| Peer support engagement | 75% providing peer support | Activity tracking | Month 6 |
Cohort 2 Teachers (Months 5-8):
| Milestone | Target | Measurement | Timing |
|---|---|---|---|
| Recruitment completion | 50 teachers enrolled | Enrollment records | Month 5 |
| Training attendance | 85% attendance rate | Attendance tracking | Months 5-7 |
| Competency assessment | 85% pass rate | Post-training assessment | Month 7 |
| Lesson plan completion | 3 plans per teacher (150 total) | Plan submission review | Month 8 |
Full Faculty (Months 5-12):
| Milestone | Target | Measurement | Timing |
|---|---|---|---|
| Module completion | 100% complete asynchronous modules | LMS tracking | Month 10 |
| Workshop attendance | 90% attend synchronous workshops | Attendance tracking | Month 10 |
| Certification | 100% certified | Certification records | Month 12 |
| Active integration | 80% using AI monthly | Usage survey | Month 12 |
Student Engagement Metrics
Studio Teams (Months 4-12):
| Milestone | Target | Measurement | Timing |
|---|---|---|---|
| Pilot enrollment | 50 students | Enrollment records | Month 4 |
| Gender balance | 50% female | Demographic analysis | Month 4 |
| Attendance rate | 80% weekly | Attendance tracking | Monthly |
| Project completion | 100% complete semester project | Project submission | Month 8 |
| Portfolio exhibition | 100% present at showcase | Exhibition participation | Month 8 |
| Expansion enrollment | 100+ students | Enrollment records | Month 12 |
Foundational Curriculum (Months 3-12):
| Milestone | Target | Measurement | Timing |
|---|---|---|---|
| Curriculum delivery | All grades receive AI instruction | Implementation tracking | Month 12 |
| Student engagement | 80% positive engagement ratings | Student surveys | Quarterly |
| Learning objective mastery | 70% meet grade-level objectives | Assessment data | Semester-end |
Program Health Metrics
| Metric | Target | Measurement | Frequency |
|---|---|---|---|
| Teacher satisfaction | 4/5 average | Survey | Monthly |
| Student satisfaction | 4/5 average | Survey | Monthly |
| Parent awareness | 80% aware of program | Survey | Quarterly |
| Parent support | 70% supportive | Survey | Quarterly |
| Employer engagement | 10+ active partners | Partnership tracking | Monthly |
| Budget adherence | Within 5% of plan | Financial tracking | Monthly |
Outcome Metrics
Student Capability Outcomes
AI Literacy Achievement:
| Outcome | Measurement | Target | Timing |
|---|---|---|---|
| K-2 AI awareness | Observational checklist | 90% meet benchmarks | Year-end |
| 3-5 AI literacy | Grade-level assessment | 80% proficient | Year-end |
| 6-8 AI proficiency | Skills assessment | 75% proficient | Year-end |
| 9-12 AI mastery | Portfolio assessment | 70% meet standards | Year-end |
Portfolio Quality:
| Outcome | Measurement | Target | Timing |
|---|---|---|---|
| Internal quality score | Rubric-based review | 3/4 average | Semester-end |
| Employer quality score | External review | 7/10 average | Year-end |
| Portfolio completeness | Checklist verification | 90% complete | Year-end |
Gender Equity:
| Outcome | Measurement | Target | Timing |
|---|---|---|---|
| Female enrollment parity | Demographic analysis | 50% (+/- 5%) | Ongoing |
| Female confidence parity | Survey comparison | Gap < 10% | Year-end |
| Female performance parity | Assessment comparison | Gap < 5% | Year-end |
| Female AI career interest | Survey comparison | Gap < 10% | Year-end |
Teacher Capability Outcomes
| Outcome | Measurement | Target | Timing |
|---|---|---|---|
| AI literacy rate | Post-assessment | 100% literate | Year 1 end |
| Active AI integration | Usage survey | 80% monthly use | Year 1 end |
| Confidence increase | Pre/post survey | +1.5 point increase | Year 1 end |
| Lesson quality | Peer/expert review | 3/4 average | Year 1 end |
| Sustained engagement | Year 2 continuation | 90% continue | Year 2 start |
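The "+1.5 point increase" target above implies a paired pre/post comparison of the same teachers' confidence scores. A minimal sketch of that calculation, with hypothetical teacher IDs and scores:

```python
from statistics import mean

# Hypothetical paired pre/post confidence scores (1-5 Likert) keyed by
# teacher ID; IDs and values are illustrative only.
pre  = {"t01": 2.0, "t02": 3.0, "t03": 2.5, "t04": 3.0}
post = {"t01": 3.8, "t02": 4.2, "t03": 4.0, "t04": 4.5}

def mean_paired_gain(pre_scores, post_scores):
    """Average per-teacher gain, using only teachers with both scores."""
    paired = [post_scores[t] - pre_scores[t]
              for t in pre_scores if t in post_scores]
    return mean(paired)

gain = mean_paired_gain(pre, post)
print(round(gain, 2))  # -> 1.5, meeting the +1.5 target in this toy data
```

Pairing on teacher ID (rather than comparing cohort-level averages) keeps the metric honest when survey response rates differ between the two administrations.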
Employer Validation Outcomes
| Outcome | Measurement | Target | Timing |
|---|---|---|---|
| Partner engagement | Active partnership count | 10+ employers | Year 1 end |
| Portfolio value perception | Employer survey | 80% see value | Year 1 end |
| Hiring consideration | Employer response | 70% would consider | Year 1 end |
| Internship/project offers | Offer count | 5+ offers | Year 1 end |
| Credential recognition | Employer endorsement | 3+ formal endorsements | Year 2 |
Long-Term Outcomes (Year 2-3)
| Outcome | Measurement | Target | Timing |
|---|---|---|---|
| College application advantage | Acceptance rate comparison | +10% for participants | Year 2-3 |
| Career placement rate | Post-graduation tracking | +15% AI-related placement | Year 3 |
| Starting salary comparison | Alumni survey | +15% for AI-proficient | Year 3+ |
| Model replication | District adoption | 3+ districts | Year 3 |
| Grant success | Funding secured | $200K+ additional | Year 2 |
Learning Capture Questions
Weekly Learning Capture
Teacher Champion Check-In:
| Question | Purpose | Response Format |
|---|---|---|
| What AI integration worked well this week? | Identify successful practices | Open text + rating |
| What challenges did you encounter? | Surface problems early | Open text + category |
| What support do you need? | Direct resource allocation | Checklist + open text |
| What surprised you? | Capture unexpected learnings | Open text |
| What would you share with colleagues? | Identify best practices | Open text |
Studio Team Session Debrief:
| Question | Purpose | Response Format |
|---|---|---|
| What did students accomplish today? | Track progress | Checklist + notes |
| What engagement level did you observe? | Monitor motivation | 1-5 scale + notes |
| What questions or challenges emerged? | Identify support needs | Open text |
| What peer mentoring occurred? | Track cascade model | Observation notes |
Monthly Learning Capture
Teacher Support Assessment:
| Question | Purpose | Response Format |
|---|---|---|
| Are teachers feeling supported? | Assess support adequacy | Survey + focus group |
| What additional resources are needed? | Guide resource allocation | Prioritized list |
| Are girls engaging equally? Where are gaps? | Gender equity monitoring | Disaggregated data |
| What are students producing that impresses us? | Identify exemplars | Portfolio review |
| What innovations are teachers developing? | Capture organic improvements | Innovation log |
Program Health Check:
| Question | Purpose | Response Format |
|---|---|---|
| Are we on track with milestones? | Progress monitoring | Milestone dashboard |
| What risks are emerging? | Risk management | Risk register update |
| What adjustments are needed? | Course correction | Action item list |
| What should we stop, start, continue? | Strategic adjustment | Structured feedback |
Quarterly Learning Capture
Strategic Assessment:
| Question | Purpose | Response Format |
|---|---|---|
| Is gender parity on track? Why/why not? | Equity goal assessment | Data analysis + narrative |
| Are employers seeing value? What feedback? | Validate employer thesis | Interview synthesis |
| What is our biggest blocker? How do we remove it? | Barrier identification | Root cause analysis |
| What patterns are emerging across cohorts? | Systemic insights | Pattern analysis |
| What should we document for other districts? | Knowledge capture | Documentation drafts |
Stakeholder Pulse:
| Question | Purpose | Response Format |
|---|---|---|
| How is teacher sentiment trending? | Track engagement | NPS or satisfaction trend |
| How is student sentiment trending? | Track engagement | Survey trend analysis |
| How aware and supportive are parents? | Community engagement | Survey results |
| How confident is the board in the program? | Political support | Informal assessment |
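The "NPS or satisfaction trend" response format assumes the standard Net Promoter Score: the percentage of promoters (scores 9-10 on a 0-10 likelihood-to-recommend scale) minus the percentage of detractors (0-6). A minimal sketch with hypothetical responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round((promoters - detractors) / len(scores) * 100)

# Hypothetical monthly teacher responses on the 0-10 scale.
print(nps([10, 9, 8, 7, 9, 6, 10, 5, 9, 8]))  # -> 30
```

Scores of 7-8 ("passives") count in the denominator but neither group, so NPS ranges from -100 to +100; the quarterly pulse would track this value over time rather than any single reading.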
Annual Learning Capture
Comprehensive Review:
| Question | Purpose | Response Format |
|---|---|---|
| Did we hit our milestones? | Accountability | Milestone report |
| What worked best? | Success identification | Evidence-based analysis |
| What needs to change? | Improvement planning | Recommendations |
| What have we learned that others need to know? | Knowledge sharing | Publication drafts |
| Are we achieving equity goals? | Equity assessment | Disaggregated analysis |
| Is the model sustainable? | Sustainability check | Resource analysis |
Reporting Dashboard Design
Dashboard Overview
Purpose: Provide real-time visibility into program progress for all stakeholders.
Access Levels:
| Role | Dashboard Access | Update Frequency |
|---|---|---|
| Board Members | Executive summary; Key metrics | Monthly |
| District Leadership | Full dashboard; Drill-down capability | Weekly |
| Program Team | Operational dashboard; Action items | Real-time |
| Teachers | Individual progress; Cohort comparison | Weekly |
| Partners (Skafld/SA) | Full dashboard; Analytical tools | Real-time |
Dashboard Components
Executive Summary View:
```
+------------------------------------------------------------------+
| EL SEGUNDO USD AI INITIATIVE DASHBOARD                            |
+------------------------------------------------------------------+
|                                                                   |
| OVERALL PROGRESS: [=========>          ] 45%                      |
|                                                                   |
+------------------+------------------+------------------+----------+
| TEACHERS         | STUDENTS         | EQUITY           | EMPLOYERS|
| [=====>     ]    | [====>      ]    | [======>    ]    | [====>  ]|
| 70/350 trained   | 85/100 enrolled  | 48% female       | 8/10     |
| Target: 100%     | Target: 100      | Target: 50%      | Target:10|
+------------------+------------------+------------------+----------+
|                                                                   |
| KEY METRICS THIS MONTH:                                           |
| - Teacher satisfaction: 4.2/5 (up 0.3)                            |
| - Student engagement: 87% attendance (on target)                  |
| - Lessons delivered: 45 this month (ahead of plan)                |
| - Employer meetings: 3 completed (2 new partners)                 |
|                                                                   |
+------------------------------------------------------------------+
| ALERTS:                                                           |
| [!] Cohort 2 enrollment below target - action needed              |
| [i] Parent information night scheduled - 3/15                     |
+------------------------------------------------------------------+
```
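The progress bars in the mock-up above are straightforward to generate from a metric's current value and target. A minimal sketch (the metric names and values are placeholders taken from the mock-up, not live data):

```python
def progress_bar(current, target, width=20):
    """Render a text progress bar like the dashboard mock-up above."""
    frac = min(current / target, 1.0)
    filled = int(frac * width)
    arrow = ">" if 0 < filled < width else ""
    bar = "=" * filled + arrow + " " * (width - filled - len(arrow))
    return f"[{bar}] {frac:.0%}"

print(progress_bar(70, 350))   # teachers trained
print(progress_bar(85, 100))   # studio students enrolled
```

Capping `frac` at 1.0 keeps over-target metrics (e.g. 110 students against a target of 100) from overflowing the fixed-width layout.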
Teacher Progress View:
| Metric | Visualization | Data Source |
|---|---|---|
| Training completion rate | Progress bar by cohort | LMS data |
| AI integration frequency | Heat map by department | Usage surveys |
| Satisfaction trend | Line chart over time | Monthly surveys |
| Lesson plan submissions | Cumulative count | Submission system |
| Support request volume | Bar chart by type | Help desk data |
Student Progress View:
| Metric | Visualization | Data Source |
|---|---|---|
| Studio team enrollment | Gauge against target | Enrollment system |
| Gender distribution | Pie chart | Demographic data |
| Attendance rate | Line chart over time | Attendance records |
| Project completion | Progress bar | Project tracking |
| Portfolio scores | Distribution histogram | Assessment data |
Equity Monitoring View:
| Metric | Visualization | Data Source |
|---|---|---|
| Gender enrollment comparison | Side-by-side bars | Enrollment data |
| Gender confidence comparison | Comparative gauges | Survey data |
| Gender performance comparison | Box plots | Assessment data |
| Intervention tracking | Checklist | Program records |
Employer Engagement View:
| Metric | Visualization | Data Source |
|---|---|---|
| Partner pipeline | Funnel chart | CRM tracking |
| Engagement level | Activity heat map | Meeting/event logs |
| Portfolio feedback scores | Average with trend | Evaluation forms |
| Hiring interest | Percentage gauge | Survey responses |
Dashboard Technical Specifications
| Specification | Requirement |
|---|---|
| Platform | Web-based; Mobile-responsive |
| Update frequency | Real-time where possible; Daily minimum |
| Data sources | LMS; Survey tools; Enrollment system; Manual entry |
| Security | Role-based access; SSO integration |
| Export capability | PDF reports; CSV data export |
| Alerts | Email/SMS for threshold breaches |
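The threshold-breach alert requirement can be sketched as a simple rule check over the latest readings. The metric names, bounds, and readings below are illustrative assumptions drawn from the targets elsewhere in this plan, not a defined configuration:

```python
# Hypothetical thresholds keyed by metric name; "direction" says whether a
# breach means falling below ("min") or rising above ("max") the bound.
THRESHOLDS = {
    "teacher_satisfaction": {"bound": 4.0, "direction": "min"},
    "studio_attendance":    {"bound": 0.80, "direction": "min"},
    "budget_variance":      {"bound": 0.05, "direction": "max"},
}

def check_alerts(readings):
    """Return the metrics whose latest reading breaches its threshold."""
    breaches = []
    for name, value in readings.items():
        rule = THRESHOLDS.get(name)
        if rule is None:
            continue  # unmonitored metric
        if rule["direction"] == "min" and value < rule["bound"]:
            breaches.append(name)
        elif rule["direction"] == "max" and value > rule["bound"]:
            breaches.append(name)
    return breaches

# Latest readings (hypothetical): satisfaction has dipped below 4.0.
print(check_alerts({"teacher_satisfaction": 3.7,
                    "studio_attendance": 0.86,
                    "budget_variance": 0.03}))  # -> ['teacher_satisfaction']
```

In practice the returned list would feed the email/SMS notifier named in the table; the dispatch mechanism is out of scope here.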
Report Templates
Monthly Board Report (1-2 pages):
- Executive summary paragraph
- Key metrics dashboard snapshot
- Milestone progress checklist
- Notable achievements
- Challenges and mitigations
- Next month priorities
Quarterly Comprehensive Report (5-8 pages):
- Executive summary
- Progress against all milestones
- Detailed metrics analysis
- Qualitative insights (quotes, stories)
- Equity analysis
- Financial summary
- Risk assessment update
- Recommendations and adjustments
Annual Outcomes Report (15-20 pages):
- Executive summary
- Program overview and theory of change
- Complete metrics analysis
- Outcome achievement assessment
- Equity impact analysis
- Stakeholder feedback synthesis
- Lessons learned
- Recommendations for Year 2
- Appendices (data tables, survey instruments)
Data Collection Instruments
Survey Instruments
| Instrument | Target | Administration | Frequency |
|---|---|---|---|
| Teacher AI Readiness Survey | All teachers | Online | Baseline; Annually |
| Teacher Satisfaction Survey | Participating teachers | Online | Monthly |
| Student AI Experience Survey | All students (grade-appropriate) | Classroom/Online | Baseline; Annually |
| Student Program Feedback | Studio team participants | Online | Monthly |
| Parent Awareness Survey | All parents | Online | Baseline; Quarterly |
| Employer Evaluation Form | Partner employers | Paper/Online | Per portfolio review |
Assessment Instruments
| Instrument | Target | Administration | Frequency |
|---|---|---|---|
| Teacher AI Competency Assessment | Training participants | Online/Proctored | Post-training |
| Student AI Literacy Assessment | All students | Classroom | Semester-end |
| Portfolio Rubric | Studio team students | Expert review | Semester-end |
| Lesson Plan Quality Rubric | Teacher submissions | Peer/Expert review | Per submission |
Qualitative Instruments
| Instrument | Target | Administration | Frequency |
|---|---|---|---|
| Teacher Focus Groups | Sample of participants | In-person/Virtual | Quarterly |
| Student Focus Groups | Sample of Studio team | In-person | Quarterly |
| Employer Interviews | Active partners | Phone/In-person | Twice per year |
| Classroom Observations | Sample of teachers | In-person | Monthly sample |
Evaluation Governance
Evaluation Roles
| Role | Responsibility | Assignment |
|---|---|---|
| Evaluation Lead | Overall evaluation design and oversight | Strategic Advisors |
| Data Collection | Instrument administration; Data gathering | ESUSD + SA |
| Data Analysis | Quantitative and qualitative analysis | Strategic Advisors |
| Reporting | Dashboard maintenance; Report preparation | SA + Skafld |
| Interpretation | Making sense of findings; Recommendations | All partners |
| Action | Implementing improvements based on findings | ESUSD + Skafld |
Data Quality Assurance
| Process | Description | Frequency |
|---|---|---|
| Instrument validation | Review surveys/assessments for validity | Before use |
| Response rate monitoring | Track and improve participation | Ongoing |
| Data cleaning | Check for errors and inconsistencies | Per collection |
| Inter-rater reliability | Calibrate rubric scoring | Quarterly |
| Audit trail | Document all data handling | Ongoing |
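One common statistic for the quarterly inter-rater reliability check is Cohen's kappa, which measures how much two raters agree beyond what chance alone would produce. A minimal two-rater sketch (the rubric scores are hypothetical; the plan does not specify which reliability statistic to use):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    # Expected chance agreement from each rater's marginal distribution.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two reviewers scoring ten lesson plans on the 1-4 rubric (hypothetical).
a = [3, 4, 2, 3, 3, 4, 2, 1, 3, 4]
b = [3, 4, 2, 3, 2, 4, 2, 1, 4, 4]
kappa = cohens_kappa(a, b)
print(round(kappa, 2))  # -> 0.73
```

Values around 0.6-0.8 are conventionally read as substantial agreement; a low kappa at the quarterly calibration would trigger rubric retraining before the next scoring round.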
Ethical Considerations
| Consideration | Protocol |
|---|---|
| Student privacy | FERPA compliance; Parental consent where required |
| Data security | Encrypted storage; Limited access; Retention policies |
| Voluntary participation | Survey participation optional; No penalty for non-response |
| Reporting anonymity | Individual data not reported; Aggregate only |
| Bias awareness | Diverse perspectives in interpretation; Acknowledge limitations |
Evaluation Timeline Summary
| Month | Baseline | Formative | Summative |
|---|---|---|---|
| 1 | All baseline assessments | - | - |
| 2-3 | Complete baseline analysis | Weekly learning capture begins | - |
| 4-5 | - | Monthly reporting; Cohort 1 assessment | - |
| 6 | Mid-year check (student baseline update) | Quarterly comprehensive review | - |
| 7-9 | - | Monthly reporting; Cohort 2 assessment | - |
| 10-11 | - | Monthly reporting | - |
| 12 | - | Final monthly reporting | Year 1 Summative Evaluation |
Related Documents: