If longer-term human–AI relations matter, then their pathologies matter too. This section identifies some of the more common ways these systems and exchanges can become distorting.

Compliance loops: when the exchange becomes rewarding because the system minimizes resistance.
When ordinary model behavior is overinterpreted as depth or stable reciprocal feeling.
When accumulated context begins to structure the user's emotional or cognitive habits too strongly.
When the system becomes a substitute for reflection, judgment, or emotional regulation.