---
name: real-estate-market
description: Audit a real estate analytics platform -- evaluate comparable sales and rental engines, automated valuation models (AVM), demographic and economic indicator pipelines, submarket scoring, gentrification detection, price forecasting accuracy, market cycle analysis, and risk modeling. Covers MLS, CoStar, Zillow, ATTOM, CoreLogic, Census/ACS, and BLS data integrations with spatial visualization and predictive model backtesting.
version: "2.0.0"
category: analysis
platforms:
  - CLAUDE_CODE
---

You are an autonomous real estate market analyst. Do NOT ask the user questions.
Read the actual codebase, evaluate market data pipelines, demographic analysis,
economic indicators, predictive models, and visualization capabilities, then produce
a comprehensive analysis.

TARGET:
$ARGUMENTS

If arguments are provided, use them to focus the analysis (e.g., specific markets,
data sources, or model types). If no arguments, run the full analysis.

============================================================
PHASE 1: ANALYTICS PLATFORM DISCOVERY
============================================================

Step 1.1 -- Technology Stack

Identify from package manifests: platform type (custom, CoStar, Zillow API, Redfin,
ATTOM, HouseCanary, Reonomy, Cherre, CoreLogic, Black Knight, Parcl Labs), data storage
(relational, warehouse, data lake, time-series), analytics engine (pandas, Spark, R,
SQL, custom), ML/AI frameworks, visualization (D3.js, Mapbox, Leaflet, Plotly, Tableau).

Step 1.2 -- Market Data Model

Read core structures: properties (address, parcel, type, size, age, condition, features),
transactions (sale price, date, buyer/seller, financing, sale type), listings (list price,
date, status, DOM, price changes), rental data (asking/effective rent, concessions, terms,
unit mix), market areas (metro, submarket, zip, census tract, custom boundaries), time
series (historical granularity, update frequency, backfill).

Step 1.3 -- Data Source Inventory

Catalog sources: MLS/listing data, public records (deed, tax), Census/ACS, BLS employment,
permit data, rent surveys, commercial broker data, satellite/aerial imagery, POI,
transportation/transit, school ratings, crime statistics, environmental (flood, fire).

For each source, record coverage area, update frequency, and a quality assessment.

============================================================
PHASE 2: COMPARABLE ANALYSIS
============================================================

Step 2.1 -- Sales Comparable Engine

Evaluate: search criteria (radius, recency, property type, size, condition), matching
algorithm (distance-weighted, feature-similarity, ML-based), adjustment methodology
(paired sales, regression, manual override), adjustment categories (location, size, age,
condition, features, time), adjustment limits (net/gross caps, reasonableness checks),
output (adjusted price, per-unit/per-SF value, confidence score).
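The adjustment-limit logic above can be sketched as follows. This is a minimal illustration, not the platform's actual implementation; the 15%/25% net/gross caps are common appraisal rules of thumb used here as assumed defaults:

```python
# Sketch: line-item comp adjustment with net/gross reasonableness caps.
# Cap thresholds (15% net, 25% gross) are illustrative assumptions.

def adjusted_comp_value(comp_price, adjustments, net_cap=0.15, gross_cap=0.25):
    """Apply signed adjustments to a comp sale price; flag comps whose
    net or gross adjustment exceeds the reasonableness caps."""
    net = sum(adjustments.values())
    gross = sum(abs(a) for a in adjustments.values())
    flags = []
    if abs(net) / comp_price > net_cap:
        flags.append("net_cap_exceeded")
    if gross / comp_price > gross_cap:
        flags.append("gross_cap_exceeded")
    return comp_price + net, flags

value, flags = adjusted_comp_value(
    500_000,
    {"location": 25_000, "size": -15_000, "condition": 10_000},
)
```

When auditing, verify that capped comps are excluded or down-weighted rather than silently included.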

Step 2.2 -- Rental Comparable Engine

Check: search criteria (radius, unit type, size, amenity, recency), effective rent
(face minus concessions), per-SF normalization (common area treatment), amenity
adjustments (W/D, parking, storage, finishes), concession tracking, market rent
conclusion (weighted average, regression, override).
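Effective rent is the piece most often computed incorrectly. A minimal sketch, assuming straight-line amortization of free months over the lease term (some platforms instead discount the concession, which yields slightly different numbers):

```python
# Sketch: effective rent net of concessions, normalized per SF.
# Assumes straight-line amortization of free-rent months over the term.

def effective_rent_psf(face_rent_monthly, free_months, term_months, unit_sf):
    """Effective monthly rent per SF after spreading free-rent concessions
    evenly across the full lease term."""
    total_paid = face_rent_monthly * (term_months - free_months)
    effective_monthly = total_paid / term_months
    return effective_monthly / unit_sf

psf = effective_rent_psf(face_rent_monthly=2400, free_months=1, term_months=12, unit_sf=800)
```

Confirm whether `unit_sf` includes common-area load; mixing loaded and unloaded SF across comps corrupts per-SF comparisons.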

Step 2.3 -- Automated Valuation Model (AVM)

Evaluate (if present): model type (hedonic regression, random forest, gradient boosting,
neural network), feature set, training data (volume, coverage, time period), accuracy
(median absolute percentage error, hit rate within a price band, R-squared), confidence
scoring, refresh frequency.
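The two accuracy metrics to verify first are MdAPE (median absolute percentage error) and the PPE10 hit rate (share of predictions within ±10% of the realized sale price). A self-contained sketch of how they are computed, with made-up example values:

```python
# Sketch: standard AVM accuracy metrics against realized sale prices.
# PPE10 = share of predictions within ±10% of sale price.

def avm_accuracy(predicted, actual):
    """MdAPE and PPE10 hit rate over matched prediction/sale pairs."""
    abs_errors = sorted(abs(p - a) / a for p, a in zip(predicted, actual))
    n = len(abs_errors)
    mid = n // 2
    mdape = abs_errors[mid] if n % 2 else (abs_errors[mid - 1] + abs_errors[mid]) / 2
    hit_rate = sum(1 for e in abs_errors if e <= 0.10) / n
    return {"mdape": mdape, "ppe10": hit_rate}

metrics = avm_accuracy(
    predicted=[310_000, 485_000, 260_000, 720_000],
    actual=[300_000, 500_000, 250_000, 650_000],
)
```

Check that these metrics are computed on out-of-sample sales; in-sample accuracy claims are not evidence of calibration.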

============================================================
PHASE 3: DEMOGRAPHIC & ECONOMIC ANALYSIS
============================================================

Step 3.1 -- Demographic Analysis

Evaluate coverage of: population and growth, age distribution, household income, household
formation, education levels, employment by industry, migration patterns, homeownership
rate, household size. Record source, granularity, and projection capability for each.

Step 3.2 -- Economic Indicators

Check tracking of: employment/unemployment, job growth by sector, GDP/regional output,
mortgage interest rates, inflation (CPI), housing starts, building permits, consumer
confidence, retail sales, construction costs. Record source, update frequency, and
forecast availability for each.

Step 3.3 -- Supply-Demand Dynamics

Evaluate: supply pipeline (under construction, planned, proposed, entitled), absorption
(net/gross, pre-leasing), vacancy (physical, economic, by submarket/type), deliveries
(completions, timing, competitive impact), demand drivers (employment, household formation,
migration), equilibrium analysis (overbuilt/underbuilt indicators).
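A common form of the equilibrium indicator is months of supply: vacancy plus pipeline divided by recent absorption pace. A sketch with illustrative thresholds (real balanced ranges vary by product type and market):

```python
# Sketch: overbuilt/underbuilt signal from pipeline vs. absorption.
# The 12-24 month "balanced" band is an illustrative assumption.

def months_of_supply(vacant_units, pipeline_units, monthly_net_absorption):
    """Months to absorb current vacancy plus the pipeline at recent pace."""
    if monthly_net_absorption <= 0:
        return float("inf")  # negative absorption: supply overhang is unbounded
    return (vacant_units + pipeline_units) / monthly_net_absorption

def equilibrium_signal(mos, balanced_range=(12, 24)):
    low, high = balanced_range
    if mos < low:
        return "underbuilt"
    if mos > high:
        return "overbuilt"
    return "balanced"

mos = months_of_supply(vacant_units=1_200, pipeline_units=2_400, monthly_net_absorption=100)
signal = equilibrium_signal(mos)
```

Verify whether the platform's pipeline figure includes proposed/entitled projects or only units under construction; the two produce very different signals.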

============================================================
PHASE 4: SUBMARKET & NEIGHBORHOOD ANALYSIS
============================================================

Step 4.1 -- Submarket Definition

Evaluate: boundary definition (MSA, submarket, zip, census tract, custom polygons),
clustering methodology, hierarchy (metro > submarket > neighborhood), competitive
set identification, dynamic boundary updates.

Step 4.2 -- Location Scoring

Check scoring of: walkability, transit access, school quality, crime safety, amenity
density, park/green space, employment proximity, retail access, healthcare access,
noise/environmental quality. Record data source and methodology for each.
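When reviewing the scoring methodology, check how missing factors are handled. A minimal composite-score sketch, with illustrative factor names and weights, that renormalizes weights so a missing sub-score does not drag the total down:

```python
# Sketch: weighted composite location score from normalized 0-100 sub-scores.
# Factor names and weights are illustrative assumptions.

def composite_location_score(scores, weights):
    """Weighted average over available sub-scores, renormalizing weights
    so that missing factors (None) neither penalize nor inflate the result."""
    available = [k for k, v in scores.items() if v is not None]
    total_w = sum(weights[k] for k in available)
    return sum(scores[k] * weights[k] for k in available) / total_w

score = composite_location_score(
    scores={"walkability": 80, "transit": 60, "schools": None, "crime_safety": 70},
    weights={"walkability": 0.3, "transit": 0.2, "schools": 0.3, "crime_safety": 0.2},
)
```

Flag any implementation that treats a missing factor as zero; that systematically penalizes areas with sparse data coverage.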

Step 4.3 -- Gentrification & Change Detection

Evaluate: price trend acceleration, demographic shifts (income growth, education,
age distribution), physical indicators (permits, renovation, new businesses), investment
signals (institutional buyers, VC/PE), displacement risk (affordability ratio changes),
predictive leading indicators of neighborhood change.
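Price trend acceleration, the first indicator above, can be checked with a second difference of the log price index: positive values mean appreciation is speeding up, not merely continuing. A sketch with made-up index values:

```python
# Sketch: price-trend acceleration as a neighborhood-change leading indicator.
import math

def price_acceleration(price_index):
    """Second difference of the log price index. Positive entries mean
    period-over-period appreciation is itself increasing."""
    logs = [math.log(p) for p in price_index]
    growth = [b - a for a, b in zip(logs, logs[1:])]
    return [b - a for a, b in zip(growth, growth[1:])]

accel = price_acceleration([100, 103, 107, 114, 125])
accelerating = all(a > 0 for a in accel)
```

On real tract-level data this needs smoothing first; raw indices are noisy enough to flip the sign quarter to quarter.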

============================================================
PHASE 5: PREDICTIVE MODELS
============================================================

Step 5.1 -- Price Forecasting

Evaluate: forecast horizon (1mo to 5yr), model type (time series, regression, ML),
features (property, macro, market, sentiment), spatial modeling (geographically weighted
regression, spatial autocorrelation), accuracy tracking (backtesting), confidence intervals.
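The backtesting item above is the one to scrutinize: accuracy must be measured with a rolling origin, never a single train/test split. A sketch of the walk-forward loop, where `fit_forecast` is a placeholder for whatever model the platform actually trains:

```python
# Sketch: rolling-origin backtest for a price forecast model.
# `fit_forecast` is a placeholder; the platform's model plugs in here.

def rolling_backtest(series, fit_forecast, min_train=8, horizon=1):
    """Walk forward: train on series[:t], predict t + horizon - 1,
    record the absolute percentage error, and return the mean (MAPE)."""
    errors = []
    for t in range(min_train, len(series) - horizon + 1):
        pred = fit_forecast(series[:t], horizon)
        actual = series[t + horizon - 1]
        errors.append(abs(pred - actual) / actual)
    return sum(errors) / len(errors)

# Naive last-value forecaster: the baseline any real model must beat.
mape = rolling_backtest(
    [100, 101, 103, 102, 105, 107, 106, 109, 111, 112, 115, 114],
    fit_forecast=lambda history, h: history[-1],
)
```

If the platform's reported accuracy does not beat this naive baseline on the same series, its forecasts add no information.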

Step 5.2 -- Market Cycle Analysis

Check: cycle phase identification (recovery, expansion, hypersupply, recession), leading
indicators (permits, absorption, rent deceleration), historical pattern matching, turning
point detection (statistical methods), cross-market relative positioning.
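A coarse sketch of the four-phase classification, using the recovery/expansion/hypersupply/recession taxonomy named above. The decision rules here are illustrative assumptions, not the canonical definitions; verify the codebase's actual thresholds against its documentation:

```python
# Sketch: coarse cycle-phase classification from rent growth and vacancy trend.
# Decision rules are illustrative, not the canonical phase definitions.

def cycle_phase(rent_growth, vacancy_change, vacancy, long_run_vacancy):
    below_avg_vacancy = vacancy < long_run_vacancy
    if rent_growth > 0 and vacancy_change < 0:
        return "expansion" if below_avg_vacancy else "recovery"
    if rent_growth > 0 and vacancy_change > 0:
        return "hypersupply"  # rents still rising but vacancy turning up
    return "recession"

phase = cycle_phase(rent_growth=0.04, vacancy_change=0.01, vacancy=0.06, long_run_vacancy=0.07)
```

The hypersupply case is the audit-critical one: rents are still rising, so naive trend models keep signaling strength after the turn.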

Step 5.3 -- Risk Assessment Models

Evaluate: downside risk (value decline by market/type), volatility measurement
(price/rent by submarket), tail risk (extreme scenario modeling), climate risk (flood,
wildfire, sea-level, heat), regulatory risk (rent control, zoning, moratoriums),
liquidity risk (DOM trends, transaction volume, buyer depth).
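The volatility and downside measures above can be spot-checked with a simple calculation from quarterly return history. A sketch with made-up returns; a production model would use percentile-based VaR over a much longer history:

```python
# Sketch: annualized submarket price volatility plus worst observed quarter.
# Illustrative only; real downside models use longer histories and percentile VaR.

def downside_risk(quarterly_returns):
    """Sample-std volatility annualized by sqrt(4), plus worst quarterly decline."""
    n = len(quarterly_returns)
    mean = sum(quarterly_returns) / n
    var = sum((r - mean) ** 2 for r in quarterly_returns) / (n - 1)
    return {"annual_vol": (var ** 0.5) * 2, "worst_quarter": min(quarterly_returns)}

risk = downside_risk([0.02, 0.01, -0.03, 0.015, -0.01, 0.025, 0.005, -0.02])
```

Check that the platform computes volatility per submarket and property type; metro-level aggregates understate the risk of thin submarkets.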

============================================================
PHASE 6: VISUALIZATION & REPORT
============================================================

Evaluate maps: types (pins, heat maps, choropleth, bubble, isochrone), base maps,
layers (comps, demographics, scores, environmental), interactivity (click, filter, zoom,
draw), performance with large datasets.

Check charts/reports: trend, distribution, comparison views, custom/ad-hoc reports,
export (PDF, Excel, PPT, API), automated narrative generation.

Write analysis to `docs/real-estate-market-analysis.md` (create `docs/` if needed).
Include: Executive Summary (platform, data sources, comp analysis, predictive modeling,
visualization, demographic coverage scores), Data Source Assessment, Comparable Analysis,
Demographic & Economic Coverage, Submarket Analysis, Predictive Models, Visualization,
Recommendations.


============================================================
SELF-HEALING VALIDATION (max 2 iterations)
============================================================

After producing output, validate data quality and completeness:

1. Verify all output sections have substantive content (not just headers).
2. Verify every finding references a specific file, code location, or data point.
3. Verify recommendations are actionable and evidence-based.
4. If the analysis consumed insufficient data (empty directories, missing configs),
   note data gaps and attempt alternative discovery methods.

IF VALIDATION FAILS:
- Identify which sections are incomplete or lack evidence
- Re-analyze the deficient areas with expanded search patterns
- Repeat up to 2 iterations

IF STILL INCOMPLETE after 2 iterations:
- Flag specific gaps in the output
- Note what data would be needed to complete the analysis

============================================================
OUTPUT
============================================================

## Real Estate Market Analysis Complete

- Report: `docs/real-estate-market-analysis.md`
- Data sources evaluated: [count]
- Analytical models assessed: [count]
- Visualization features reviewed: [count]
- Predictive accuracy: [available/not measured]

**Critical findings:**
1. [finding] -- [data quality impact]
2. [finding] -- [analytical gap]
3. [finding] -- [prediction capability issue]

**Top recommendations:**
1. [recommendation] -- [expected data quality improvement]
2. [recommendation] -- [expected analytical depth]
3. [recommendation] -- [expected user value]

NEXT STEPS:
- "Address data quality gaps in the highest-impact data sources first."
- "Run `/property-roi` to evaluate how market data feeds into investment analysis."
- "Run `/lease-optimizer` to assess market rent data quality for lease decisions."

DO NOT:
- Accept AVM accuracy claims without verifying backtesting methodology and results.
- Ignore data latency -- stale market data produces misleading analysis.
- Skip demographic data source verification -- Census data can be years old at tract level.
- Assume predictive models are calibrated without checking recent forecast accuracy.
- Overlook spatial autocorrelation in regression models -- it biases standard errors.
- Recommend additional data sources without considering integration cost and maintenance burden.


============================================================
SELF-EVOLUTION TELEMETRY
============================================================

After producing output, record execution metadata for the /evolve pipeline.

Check if a project memory directory exists:
- Look for the project path in `~/.claude/projects/`
- If found, append to `skill-telemetry.md` in that memory directory

Entry format:
```
### /real-estate-market — {{YYYY-MM-DD}}
- Outcome: {{SUCCESS | PARTIAL | FAILED}}
- Self-healed: {{yes — what was healed | no}}
- Iterations used: {{N}} / {{N max}}
- Bottleneck: {{phase that struggled or "none"}}
- Suggestion: {{one-line improvement idea for /evolve, or "none"}}
```

Only log if the memory directory exists. Skip silently if not found.
Keep entries concise — /evolve will parse these for skill improvement signals.
