Evidence before elegance
About · Methodology
The standard is not a pile of lectures. It is a slow editorial process that turns production incidents, adversarial technique, and review from working practitioners into a curriculum candidates can trust.
Draft
Every section starts as a field inventory: what operators see in incidents, what vendors claim, what attackers can actually reproduce, and what breaks when the signal meets production traffic.
Review
Working-group reviewers annotate drafts for missing threat models, untestable claims, weak lab assumptions, and language that drifts into product marketing.
Validate
A lab is not accepted until a candidate can run it from a clean environment, produce the expected artifact, and understand what a false positive or false negative would cost.
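The acceptance gate above can be sketched as a simple check over candidate runs. This is an illustrative sketch only; the `LabRun` record and `accept_lab` function are hypothetical names, not part of any published tooling, and the real criteria are judged by reviewers rather than booleans.

```python
from dataclasses import dataclass

# Hypothetical record of one candidate run of a draft lab.
# Every field name here is illustrative, not a real schema.
@dataclass
class LabRun:
    clean_environment: bool   # run started from a fresh, unprovisioned host
    artifact_produced: bool   # the expected artifact (log, alert, capture) appeared
    fp_cost_explained: bool   # candidate articulated the cost of a false positive
    fn_cost_explained: bool   # ...and the cost of a false negative

def accept_lab(runs: list[LabRun]) -> bool:
    """A lab is accepted only if at least one run satisfies every criterion."""
    return any(
        r.clean_environment
        and r.artifact_produced
        and r.fp_cost_explained
        and r.fn_cost_explained
        for r in runs
    )
```

The point of the gate is that partial success does not count: a run that produces the artifact but starts from a dirty environment, or one where the candidate cannot price a false negative, still fails.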
Govern
The curriculum can react to the field quickly, but the exam blueprint changes only through recorded review, so candidates are not surprised by unstable assessment targets.
Peer review
Reviewers do not merely approve copy. They try to break the implied operating model. If a section says a signal is useful, reviewers ask where it is noisy. If a lab teaches a mitigation, reviewers ask how it fails under real user pressure. If a term sounds familiar but means different things across vendors, it moves into the glossary.
The result is deliberately plain: fewer slogans, more artifacts, and a record of why the standard teaches one trade-off before another.
Validation checklist