optimizing-attention-flash is a production-ready Claude Code skill (quality score 70/100) in the general-misc sub-category. It ships as a SKILL.md file that Claude Code auto-discovers under ~/.claude/skills/optimization-flash-attention/ and loads when your prompt matches the skill's trigger.
When to invoke it: use it when training or running transformers with long sequences (>512 tokens), when hitting GPU memory limits in the attention layers, or when you need faster inference. It supports PyTorch's native SDPA, the flash-attn library, H100 FP8, and sliding window attention.
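As a minimal sketch of the simplest backend the skill covers, here is PyTorch's native SDPA path (`torch.nn.functional.scaled_dot_product_attention`, available in PyTorch 2.0+), which automatically dispatches to a FlashAttention kernel on supported GPUs. The tensor shapes below are illustrative, not prescribed by the skill; the flash-attn library and FP8 paths need extra setup not shown here.

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: (batch, num_heads, seq_len, head_dim)
q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)
v = torch.randn(1, 8, 1024, 64)

# Fused attention; PyTorch picks a FlashAttention-style kernel
# on CUDA when the inputs qualify, else falls back to math SDPA.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)
```

On CUDA you can additionally pin the backend with the `torch.nn.attention.sdpa_kernel` context manager (PyTorch 2.3+) to verify the flash kernel is actually being used.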
The optimizing-attention-flash skill is built for Claude Code users and developers across all disciplines looking for general-purpose AI assistance. It is part of the open ClaudSkills registry, a community-curated catalog of 15,000+ capabilities you can install for Claude Code, the Claude CLI agent.
mkdir -p ~/.claude/skills/optimization-flash-attention
curl -L https://claudskills.com/skills/optimization-flash-attention/SKILL.md \
  -o ~/.claude/skills/optimization-flash-attention/SKILL.md
Or just download SKILL.md directly and drop it into ~/.claude/skills/optimization-flash-attention/. Claude Code auto-discovers it on next session.
Skills live at ~/.claude/skills/optimization-flash-attention/SKILL.md on macOS/Linux, or %USERPROFILE%\.claude\skills\optimization-flash-attention\SKILL.md on Windows. See the full install guide for step-by-step instructions.
The ClaudSkills desktop app installs any skill directly into ~/.claude/skills/ with one click — no terminal required. Pro starts at $9/mo or $149 lifetime.
Browse all General skills in the ClaudSkills registry, or explore other top-rated picks from the same category.
Part of Acreator Store — Adam Lankamer's AI tools: GifPerfect · AspectPerfect · SlomoPerfect · Ucaption · UTagger · AutoXPoster · TestYourSkills