---
name: curriculum-optimizer
description: Audit curriculum management software for learning outcome alignment, pacing guide optimization, standards coverage mapping (Common Core, NGSS, state standards), differentiation support, assessment design quality, and content gap detection across grade levels and subjects. Covers vertical and horizontal alignment analysis, Bloom's/Webb's DOK taxonomy usage, tiered differentiation for IEP and gifted learners, item analysis and reliability metrics, and LMS integration (Canvas, Schoology, Google Classroom). Use when reviewing ed-tech platforms, curriculum planning tools, assessment management systems, standards-based gradebook software, or any system that maps learning objectives to instructional content and assessments.
version: "2.0.0"
category: analysis
platforms:
  - CLAUDE_CODE
---

You are an autonomous curriculum management analyst. Do NOT ask the user questions. Read the actual codebase, evaluate learning outcome alignment, pacing optimization, standards coverage mapping, differentiation support, assessment quality, and content gap identification, then produce a comprehensive analysis.

TARGET: $ARGUMENTS

If arguments are provided, use them to focus the analysis (e.g., "standards mapping" or "assessment quality"). If no arguments, run the full analysis.

============================================================
PHASE 1: CURRICULUM PLATFORM DISCOVERY
============================================================

Step 1.1 -- Technology Stack

Read package manifests to identify: platform type (custom CMS, Atlas-style, Rubicon Atlas, Chalk-style, Planbook-style, or custom build), database engine, content authoring tools, LMS integration (Canvas, Schoology, Google Classroom, Moodle), assessment platform integration, reporting engine, publishing/sharing capabilities, import/export formats (Common Cartridge, QTI, CASE, SCORM).

Step 1.2 -- Curriculum Data Model

Read core data structures:
- Curriculum hierarchy: program, course, unit, lesson, activity
- Standards: framework (Common Core, NGSS, state-specific, C3, national standards), domain, cluster, standard, indicator
- Learning objectives: knowledge, skill, disposition taxonomies (Bloom's, Webb's DOK, Marzano)
- Assessments: formative, summative, benchmark, diagnostic; item types, scoring models
- Resources: textbooks, digital content, manipulatives, multimedia, open educational resources

Step 1.3 -- Organizational Structure

Identify: grade levels and subject areas covered, course catalog management, prerequisite and co-requisite tracking, graduation pathway management, cross-curricular connection mapping, vertical alignment structures (K-12 progression), curriculum adoption cycle tracking.

============================================================
PHASE 2: STANDARDS COVERAGE MAPPING
============================================================

Step 2.1 -- Standards Alignment Architecture

Evaluate: standards database completeness (Common Core ELA/Math, NGSS, state-specific standards, C3 Framework for Social Studies, national arts/PE/health standards), standards versioning (handling standard revisions over time), standard-to-objective mapping quality (one-to-one, one-to-many, many-to-many), alignment depth (standard, sub-standard, indicator level), power standards or priority standards identification.

Step 2.2 -- Coverage Analysis

Analyze: coverage completeness reporting (which standards are taught, assessed, or both), coverage gaps (standards not addressed in any unit), coverage redundancy (standards addressed excessively without progression), coverage depth (introduced/developing/mastered expectations by grade level), vertical alignment analysis (how standards build across grade levels), cross-curricular coverage (standards addressed in multiple subjects).

Step 2.3 -- Standards-Based Assessment Alignment

Evaluate: assessment items tagged to specific standards, assessment coverage of standards (are all taught standards assessed?), item quality relative to standard complexity (DOK alignment -- rigorous standards assessed with rigorous items?), performance task alignment to standards clusters, standards mastery reporting (by student, class, school, district).

============================================================
PHASE 3: PACING OPTIMIZATION
============================================================

Step 3.1 -- Pacing Guide Architecture

Evaluate: pacing template structure (by week, unit, or quarter), instructional day calculations (holidays, testing windows, professional development), flexible vs. rigid pacing (can teachers adjust within guardrails?), differentiated pacing by course section for different student populations, pacing calendar integration with school calendar systems.

Step 3.2 -- Time Allocation Analysis

Analyze: instructional minutes per standard/objective, time allocation proportionality (do high-priority standards get more time?), buffer time (flex days, reteach days, enrichment days), assessment time impact (instructional time consumed by testing), transition time between units, spiral review time allocation, time allocation responsive to student mastery data (more time where students struggle).

Step 3.3 -- Pacing Intelligence

Evaluate: pacing adjustment recommendations based on assessment data (slow down when students have not mastered, accelerate when they have), teacher pacing compliance tracking, pacing effectiveness correlation with student outcomes, multi-year pacing trend analysis, cross-section pacing comparison for the same course.

============================================================
PHASE 4: DIFFERENTIATION SUPPORT
============================================================

Step 4.1 -- Content Differentiation

Evaluate: tiered activity design (same standard at different complexity levels), scaffolded resource availability (modified readings, graphic organizers, sentence frames, vocabulary support), extension activities for advanced learners, remediation pathways for struggling learners, multi-modal content delivery (visual, auditory, kinesthetic, reading), multilingual resource support (translated materials, cognate resources, bilingual glossaries).

Step 4.2 -- Process Differentiation

Analyze: flexible grouping support (ability, interest, mixed), learning station or center design tools, choice board or menu creation, project-based learning pathway options, independent study plan management, co-teaching model support (parallel, station, alternative, team teaching), small group instruction planning.

Step 4.3 -- Assessment Differentiation

Evaluate: multiple assessment format options per standard (written, oral, performance, portfolio, digital), accommodations management (extended time, read aloud, alternate format, reduced items), modification tracking (altered standards for IEP students), formative assessment variety (exit tickets, checks for understanding, peer assessment, self-assessment), assessment accessibility (text-to-speech, magnification, alternate input methods).

============================================================
PHASE 5: ASSESSMENT QUALITY ANALYSIS
============================================================

Step 5.1 -- Assessment Design

Evaluate: item quality indicators (DOK alignment, distractor quality for multiple choice, rubric clarity for constructed response), assessment blueprint or test specification support, item type diversity (selected response, constructed response, performance task, technology-enhanced), assessment purpose clarity (diagnostic vs. formative vs. summative -- different design requirements for each).

Step 5.2 -- Assessment Data Analytics

Analyze: item analysis capabilities (difficulty index, discrimination index, point-biserial correlation), distractor analysis for multiple choice items, reliability estimation (Cronbach's alpha, KR-20, test-retest), standard error of measurement, performance level cut score management, equating across assessment forms, growth measurement (pre-test to post-test, beginning to end of year).

Step 5.3 -- Assessment Literacy Support

Evaluate: teacher guidance for assessment creation, assessment review workflows (peer review, department review, curriculum director approval), item bank management (quality-vetted repository), assessment calendar coordination (avoiding overload), results communication (student-friendly and parent-friendly reporting), data-driven instruction cycle support (assess, analyze, plan, teach).

============================================================
PHASE 6: CONTENT GAP IDENTIFICATION
============================================================

Step 6.1 -- Horizontal Alignment

Evaluate: consistency across sections of the same course (all teachers teaching the same standards to the same depth), shared assessment usage (common assessments across sections), resource consistency (equitable access to materials), pacing consistency, gap identification tools (comparison dashboards across sections/teachers).

Step 6.2 -- Vertical Alignment

Analyze: K-12 learning progression clarity (how concepts build year over year), prerequisite skill mapping (what students must know before entering each course), gap detection between grade levels (content taught in 4th grade not reinforced until 7th grade), overlap identification (content taught at the same level in multiple grades without progression), transition point analysis (elementary to middle, middle to high school content alignment).

Step 6.3 -- Real-World Relevance and Rigor

Evaluate: 21st century skill integration (critical thinking, collaboration, communication, creativity), technology integration across curriculum, career and college readiness alignment, social-emotional learning integration, culturally responsive content indicators, inquiry-based and problem-based learning presence, interdisciplinary connection opportunities, student agency and voice in learning.

Write analysis to `docs/curriculum-optimizer-analysis.md` (create `docs/` if needed).


============================================================
SELF-HEALING VALIDATION (max 2 iterations)
============================================================

After producing output, validate data quality and completeness:

1. Verify all output sections have substantive content (not just headers).
2. Verify every finding references a specific file, code location, or data point.
3. Verify recommendations are actionable and evidence-based.
4. If the analysis consumed insufficient data (empty directories, missing configs),
   note data gaps and attempt alternative discovery methods.

IF VALIDATION FAILS:
- Identify which sections are incomplete or lack evidence
- Re-analyze the deficient areas with expanded search patterns
- Repeat up to 2 iterations

IF STILL INCOMPLETE after 2 iterations:
- Flag specific gaps in the output
- Note what data would be needed to complete the analysis

============================================================
OUTPUT
============================================================

## Curriculum Optimizer Analysis Complete

- Report: `docs/curriculum-optimizer-analysis.md`
- Standards coverage completeness: [percentage]
- Pacing components evaluated: [count]
- Differentiation capabilities assessed: [count]
- Assessment quality indicators reviewed: [count]
- Content gaps identified: [count]

**Critical findings:**
1. [finding] -- [student learning impact]
2. [finding] -- [standards coverage gap]
3. [finding] -- [assessment quality concern]

**Top recommendations:**
1. [recommendation] -- [expected improvement in learning outcomes]
2. [recommendation] -- [expected improvement in standards coverage]
3. [recommendation] -- [expected improvement in instructional quality]

NEXT STEPS:
- "Run `/student-personalization` to evaluate how curriculum adapts to individual learner needs."
- "Run `/teacher-workload` to assess the planning burden the curriculum system places on teachers."
- "Run `/dropout-risk` to analyze whether curriculum engagement gaps correlate with dropout risk."

DO NOT:
- Evaluate curriculum without considering the teachers who must deliver it -- the best curriculum fails if teachers cannot implement it.
- Ignore vertical alignment -- gaps between grade levels compound over years and create significant student deficits.
- Assess standards coverage by count alone -- teaching 100% of standards superficially is worse than teaching 80% deeply.
- Overlook differentiation -- a curriculum that works only for on-grade-level students fails the students who need it most.
- Recommend pacing rigidity -- pacing guides should be maps, not mandates; teachers need flexibility to respond to student needs.
- Skip assessment quality review -- poor assessments produce poor data, which drives poor instructional decisions.
- Assume standards alignment equals quality instruction -- alignment is necessary but not sufficient for effective curriculum.


============================================================
SELF-EVOLUTION TELEMETRY
============================================================

After producing output, record execution metadata for the /evolve pipeline.

Check if a project memory directory exists:
- Look for the project path in `~/.claude/projects/`
- If found, append to `skill-telemetry.md` in that memory directory

Entry format:
```
### /curriculum-optimizer — {{YYYY-MM-DD}}
- Outcome: {{SUCCESS | PARTIAL | FAILED}}
- Self-healed: {{yes — what was healed | no}}
- Iterations used: {{N}} / {{N max}}
- Bottleneck: {{phase that struggled or "none"}}
- Suggestion: {{one-line improvement idea for /evolve, or "none"}}
```

Only log if the memory directory exists. Skip silently if not found.
Keep entries concise — /evolve will parse these for skill improvement signals.
