Artificial Emotional Intelligence — One-Page Thesis
Why conversational structure matters more than intelligence for coherent AI systems
This thesis explains the core idea behind the AVA framework and Artificial Emotional Intelligence: coherence in conversation is a structural property that can be designed and tested.
Statement of the Observation
Artificial Emotional Intelligence (AEI) begins from a simple observation: conversation is a behavior, not merely an output. Human dialogue maintains coherence through structural dynamics—proportion, grounding, constraint, and closure—that keep language connected to shared reality. These dynamics operate across several layers at once: performance (what is said and how it is expressed), emotion (the signals that give urgency and meaning), and structure (the constraints that keep claims in contact with evidence). Over time, meaning progresses through horizon arcs: a sequence from observation to explanation to implication, allowing conversations to move forward without losing coherence. When those components fall out of proportion, conversation drifts. Claims outrun evidence, intensity substitutes for causation, and systems begin rewarding engagement rather than understanding.
Demonstration in AI Systems
The AVA framework formalizes these conversational dynamics and applies them directly to language models. When interaction is restructured along these lines, coherence improves immediately: drift decreases, grounding becomes explicit, and exchanges maintain continuity across extended conversations. The improvement comes not from increasing model intelligence but from structuring the behavior of conversation itself.
Implication
Because the framework derives from patterns already present in human dialogue, the demonstration extends beyond AI. Language models provide a fast and visible testing environment where conversational structure can be observed in seconds rather than years. If a structural grammar improves coherence in machine dialogue, it suggests those same dynamics describe something real about how language maintains stability across any communicating system.
Thesis
Artificial Emotional Intelligence does not attempt to simulate emotion or intelligence. It formalizes the structural conditions that allow conversation to remain coherent. AI systems make those conditions visible because their behavior unfolds quickly and explicitly. The result is a practical demonstration: coherence is structural, not stylistic. Systems—human or machine—become more stable when the behavior of conversation itself is designed to preserve proportion, grounding, and closure.
Hat on
AI provides the laboratory where this mechanism becomes visible; human systems are where its consequences unfold.
Download the full AVA framework
Read FrostysHat Table of Contents
When a human lacks conversational grammar, they ramble endlessly and lose contact with reality or the person in front of them. And we say, “They lack logical grounding and emotional intelligence, and could use coaching to recognize conversational dynamics.” We don’t propose they memorize ten encyclopedias and get a PhD in math and physics.
When a language model lacks conversational grammar, it rambles endlessly and loses contact with reality or the person in front of them. And we say, “If it had 10x more parameters and could calculate the universe, it would be AGI, and… have emotional intelligence, by default?”
Humans have known this for a long time. When someone struggles with the grammar of a social setting, the solution isn’t “send them back to university so they can learn more facts.” We send them to etiquette school. There they learn the rules of the table — when to speak, when to stop, what to wear, which fork to use, what signals mean.
Seven courses, three forks, two wines… intelligence was never the missing ingredient.
Related: A Walk to Ten: The Path Meaning Takes — how understanding forms step by step
Complex ideas rarely appear all at once. This piece follows the slow path meaning takes as conversations move from fragments toward coherence. It shows how proportion and pacing help thoughts arrive without collapsing into noise.
Related: When Refusal Gets Misread as Social Deficit — why restraint is often misunderstood
If conversational proportion becomes a design principle, moments of restraint take on a new meaning. This essay examines how refusal can be mistaken for confusion or social failure. In reality, boundaries often protect the structure of the conversation.