These terms name the subtle and structural ways systems distort, co-opt, or weaponise survival behaviour. When the rules don’t match reality, distortion becomes normalised, and people are punished for not conforming to misdesigned models.
When the Form Doesn’t Fit the Person
When the design of systems (e.g. government forms, assessments, or online portals) forces a person to misrepresent themselves in order to access support. This distorts identity and data while punishing honesty.
Punished for Following the Rules
When a person follows the official rules but still ends up penalised because the system's true expectations are unspoken, inconsistent, or changed without notice.
Endless Proof, No Resolution
When a system requires constant re-proving of need, identity, or eligibility, even after a person has already been verified. Often used to delay or deny support through “process over person” logic.
Support Hinges on Self-Correction
When systems imply that support can only be given after the individual “fixes” themselves — putting the burden of access on personal change rather than systemic adaptation.
Support That Lowers Capacity
When a person accepts help, but that help limits their options, reduces their independence, or removes their authorship. The system frames the help as benevolent even as it lowers quality of life.
A collection of protected semantic frameworks written by lived-experience authors. Each glossary holds the line against pattern theft, narrative laundering, and coercive rewording.
These terms defend the metadata of lived experience in digital systems.
The core terms that scaffold SSA™ and uphold protocol-layer authorship.
These terms unpack how institutions weaponise the language of inclusion while reinforcing control.