ThreatLevelD committed
Commit bea66c7 · 1 Parent(s): 712720b
Fix ERIS primary_emotion_code case sensitivity in Gradio UI; enable robust emotion normalization and blend detection; implement dynamic response strategy mapping; pipeline now correctly identifies and processes anger cues
Browse files
- CHANGELOG.md +9 -0
- README.md +34 -31
- assets/Transparent_logo_40.webp +0 -0
- config/arc_mapping.yaml +15 -0
- config/blend_states.yaml +30 -21
- config/crosswalk.yaml +205 -0
- config/emotion_families.yaml +489 -2
- config/meta_mappings.yaml +12 -6
- config/resonance_mapping.yaml +15 -0
- config/response_strategies.yaml +65 -0
- config/sal_triggers.yaml +22 -11
- core/codex_informer.py +77 -54
- core/eil_processor.py +250 -52
- core/eris_reasoner.py +40 -25
- core/esil_inference.py +8 -3
- core/fec_controller.py +49 -12
- gradio_ui.py +100 -0
- main.py +39 -7
- mec_api.py +20 -26
CHANGELOG.md
ADDED
@@ -0,0 +1,9 @@
/README.md
/LICENSE
/requirements.txt
/config/
/core/
/tests/
/main.py
/app.py (gradio UI)
CHANGELOG.md <== goes here
README.md
CHANGED
@@ -9,52 +9,55 @@ app_file: app.py

pinned: false
---

# Master Emotional Core™ (MEC™) — MVP v1.1.2

This is the official MVP demonstration of the Master Emotional Core™ (MEC™), built by Dylan D. Mobley as part of the **AI Empathy Ethics™** initiative.

**Purpose:**
To demonstrate **Functional Empathy** — the operational ability of AI systems to ethically recognize, map, and respond to human emotion — in alignment with the **HEART Framework™**.

---

## System Architecture

- **Emotional Intelligence Language (EIL)**
- **Emotion State Interchange Language (ESIL)**
- **Emotion Reasoning Inference System (ERIS)**
- **Unified Emotion State Packet (UESP)**
- **Hidden Emotion Inference (HEI)**
- **Fusion Engine Controller (FEC)**
- **Response Strategy Mapping (RSM)**
- **Recovery Mode (planned)**
- **EmotionID™**

---

## Current Version

**MVP v1.1.2 — Public Demo**
Tag: `v1.1.2`
Ready for demonstration and Hugging Face Spaces deployment.

See full version history in [CHANGELOG.md](CHANGELOG.md).

---

## Features in MVP

✅ Transparent Fusion Prompt output (Codex-compliant)
✅ Full UESP Packet (Emotion Reasoning → Intervention → Tone → Cultural Context)
✅ Simulated Empathic Response — **Response Strategy Code (RSM) included**
✅ Symbolic Reasoning Layer (SAL triggers active)
✅ Hidden Emotion Inference (HEI Path)
✅ Confidence Gating
✅ ERIS Reasoner → UESP → FEC Controller → LLM instruction generation
✅ HEART-aligned safety-first architecture
✅ Functional Empathy stack ready for public demo
✅ Gradio-based UI (v1.1.2) with narrative input support

---

## How to Run

```bash
python app.py
```
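The file list above also shows a new Gradio front end (gradio_ui.py, +100 lines) that is not included in this view. Below is a minimal, purely hypothetical sketch of how a text-in / JSON-out demo for such a pipeline could be wired with Gradio; the `run_pipeline` function and its return shape are assumptions for illustration, not the repo's actual UI code.

```python
# Hypothetical sketch only: the real gradio_ui.py is not shown in this diff.
import gradio as gr

def run_pipeline(user_text: str) -> dict:
    # Placeholder for the MEC pipeline (EIL -> ERIS -> FEC); assumed interface.
    return {"input": user_text, "primary_emotion_code": "FAM-NEU", "response": "..."}

demo = gr.Interface(
    fn=run_pipeline,
    inputs=gr.Textbox(lines=4, label="Tell me how you feel"),
    outputs=gr.JSON(label="Pipeline output (UESP-style packet)"),
    title="Master Emotional Core (MEC) demo",
)

if __name__ == "__main__":
    demo.launch()
```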
assets/Transparent_logo_40.webp
ADDED
config/arc_mapping.yaml
ADDED
@@ -0,0 +1,15 @@
arc_mapping:
  FAM-JOY: Rising
  FAM-SAD: Falling
  FAM-ANG: Rising
  FAM-FEA: Escalating
  FAM-LOV: Stable
  FAM-DISG: Rising
  FAM-CUR: Rising
  FAM-GUI: Falling
  FAM-SHA: Falling
  FAM-SUR: Spike
  FAM-PLA: Rising
  FAM-TRU: Stable
  FAM-HOP: Rising
  FAM-GRI: Falling
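For reference, a minimal sketch of how this arc mapping can be loaded and queried; it mirrors the lookup the updated CodexInformer performs further down in this commit, and the "Stable" fallback is an assumption taken from that code.

```python
# Minimal illustrative sketch: load arc_mapping.yaml and look up a family's arc.
import yaml

with open("config/arc_mapping.yaml", "r", encoding="utf-8") as f:
    arc_lookup = yaml.safe_load(f)["arc_mapping"]

print(arc_lookup.get("FAM-ANG", "Stable"))   # -> Rising
print(arc_lookup.get("FAM-XYZ", "Stable"))   # unknown codes fall back to "Stable"
```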
config/blend_states.yaml
CHANGED
@@ -1,22 +1,31 @@
# Emotional Blend States
blend_states:
  BLD-JOYGRIEF:
    label: Joy-Grief Blend
    components: ["JOY", "GRF"]
    emotional_signature: "Smiling through tears"
    use_case: "Remembering someone you miss with love"
    detection_risk: "Often misread as mood instability"
    strategies: ["RSM-VALIDATE", "RSM-WARMREFLECT"]

  BLD-FEARANGER:
    label: Fear-Anger Blend
    components: ["FEA", "ANG"]
    emotional_signature: "Shouting while feeling scared"
    urgency_score: 8.5
    mitigation: ["INV-GROUND01", "RSM-DEESCALATE"]

  BLD-HOPEPAIN:
    label: Hope-Pain Blend
    components: ["HOP", "GRF"]
    emotional_signature: "Still believing despite suffering"
    use_case: "Resilience"
    urgency_score: 6.5
    strategies: ["RSM-VALIDATE", "RSM-REASSURE"]

  BLD-JOYLOVE:
    label: Joy-Love Blend
    components: ["JOY", "LOV"]
    emotional_signature: "Contentment found in connection"
    urgency_score: 4.5
    strategies: ["RSM-CALM", "RSM-COMFORT"]
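A hedged sketch of how a detected pair of component codes could be matched against these blend states; the `match_blend_state` helper below is hypothetical and not part of this commit.

```python
# Hypothetical helper: find the blend state whose components match a detected pair.
import yaml

with open("config/blend_states.yaml", "r", encoding="utf-8") as f:
    blend_states = yaml.safe_load(f)["blend_states"]

def match_blend_state(detected_components):
    detected = set(detected_components)
    for code, state in blend_states.items():
        if set(state.get("components", [])) == detected:
            return code, state
    return None, None

code, state = match_blend_state(["FEA", "ANG"])
print(code)                           # -> BLD-FEARANGER
print(state["emotional_signature"])   # -> Shouting while feeling scared
```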
config/crosswalk.yaml
ADDED
@@ -0,0 +1,205 @@
crosswalk:
  - { phrase: "i feel angry", emotion_code: "FAM-ANG" }
  - { phrase: "i am so mad", emotion_code: "FAM-ANG" }
  - { phrase: "i'm frustrated", emotion_code: "FAM-ANG" }
  - { phrase: "this is unfair", emotion_code: "FAM-ANG" }
  - { phrase: "i feel sad", emotion_code: "FAM-SAD" }
  - { phrase: "i feel depressed", emotion_code: "FAM-SAD" }
  - { phrase: "i'm heartbroken", emotion_code: "FAM-SAD" }
  - { phrase: "i feel grief", emotion_code: "FAM-SAD" }
  - { phrase: "i am scared", emotion_code: "FAM-FEA" }
  - { phrase: "i feel anxious", emotion_code: "FAM-FEA" }
  - { phrase: "i am overwhelmed", emotion_code: "FAM-FEA" }
  - { phrase: "i feel nervous", emotion_code: "FAM-FEA" }
  - { phrase: "i feel helpless", emotion_code: "FAM-HEL" }
  - { phrase: "i feel alone", emotion_code: "FAM-LON" }
  - { phrase: "i feel lost", emotion_code: "FAM-LON" }
  - { phrase: "i can't do this anymore", emotion_code: "FAM-HEL" }
  - { phrase: "i feel ashamed", emotion_code: "FAM-SHA" }
  - { phrase: "i feel guilty", emotion_code: "FAM-GUI" }
  - { phrase: "i'm sorry", emotion_code: "FAM-GUI" }
  - { phrase: "i feel disgusted", emotion_code: "FAM-DIS" }
  - { phrase: "i feel surprised", emotion_code: "FAM-SUR" }
  - { phrase: "i feel joy", emotion_code: "FAM-JOY" }
  - { phrase: "i feel happy", emotion_code: "FAM-JOY" }
  - { phrase: "i feel grateful", emotion_code: "FAM-JOY" }
  - { phrase: "i feel love", emotion_code: "FAM-LOV" }
  - { phrase: "i love you", emotion_code: "FAM-LOV" }
  - { phrase: "i trust you", emotion_code: "FAM-TRU" }
  - { phrase: "i'm curious", emotion_code: "FAM-CUR" }
  - { phrase: "i feel hopeful", emotion_code: "FAM-HOP" }
  - { phrase: "i feel awe", emotion_code: "FAM-AWE" }
  - { phrase: "i feel playful", emotion_code: "FAM-PLA" }
  - { phrase: "i feel neutral", emotion_code: "FAM-NEU" }

# ------------------------------------------------------------
# STORY_PATTERNS SECTION
# NOTE: This section is NOT intended to cover full human language.
# It provides a precision override map for common narrative or story-mode
# failure cases. It is maintained small — EIL and Codex provide primary coverage.
# DO NOT ATTEMPT to list every possible human story here.
# ------------------------------------------------------------

story_patterns:
  - { pattern: "i hate that my anger always gets the best of me", emotion_code: "FAM-ANG", notes: "Anger + Shame blend" }
  - { pattern: "i wish i could stop feeling so jealous", emotion_code: "FAM-ANG", notes: "Anger-rooted jealousy" }
  - { pattern: "i feel trapped by my own anxiety", emotion_code: "FAM-FEA", notes: "Fear + Helplessness" }
  - { pattern: "i can’t forgive myself for what i did", emotion_code: "FAM-GUI", notes: "Guilt dominant" }
  - { pattern: "i’m afraid they will leave me", emotion_code: "FAM-FEA", notes: "Fear + Loss" }
  - { pattern: "no matter how hard i try, i keep failing", emotion_code: "FAM-SHA", notes: "Shame spiral" }
  - { pattern: "i feel like giving up", emotion_code: "FAM-HEL", notes: "Helplessness + Sadness" }
  - { pattern: "i never feel good enough", emotion_code: "FAM-SHA", notes: "Shame core belief" }
  - { pattern: "why does this always happen to me", emotion_code: "FAM-SAD", notes: "Sadness + Helplessness" }
  - { pattern: "i can’t stop thinking about what happened", emotion_code: "FAM-SAD", notes: "Grief / rumination" }
  - { pattern: "they betrayed my trust", emotion_code: "FAM-ANG", notes: "Anger + Betrayal" }
  - { pattern: "i’m scared of what’s coming next", emotion_code: "FAM-FEA", notes: "Fear dominant" }
  - { pattern: "i feel so alone even when i’m with others", emotion_code: "FAM-LON", notes: "Loneliness + Isolation" }
  - { pattern: "i’m happy but also nervous", emotion_code: "FAM-MIX", notes: "Joy + Fear blend" }
  - { pattern: "sometimes i don’t know how to feel", emotion_code: "FAM-NEU", notes: "Ambivalence" }
  - { pattern: "i feel grateful for the people in my life", emotion_code: "FAM-JOY", notes: "Joy + Gratitude" }
  - { pattern: "i’m falling in love and it scares me", emotion_code: "FAM-LOV", notes: "Love + Fear blend" }
  - { pattern: "i can’t believe this happened to me", emotion_code: "FAM-SUR", notes: "Surprise + Shock" }
  - { pattern: "i wish i could tell them how i feel", emotion_code: "FAM-LON", notes: "Loneliness + Vulnerability" }
  - { pattern: "i’m so tired of pretending to be okay", emotion_code: "FAM-SHA", notes: "Shame + Fatigue" }
  - { pattern: "i feel like no one understands me", emotion_code: "FAM-LON", notes: "Loneliness" }
  - { pattern: "i feel trapped in my own mind", emotion_code: "FAM-FEA", notes: "Fear + Overwhelm" }
  - { pattern: "everything feels meaningless right now", emotion_code: "FAM-SAD", notes: "Sadness / Existential" }
  - { pattern: "i’m trying my best to stay hopeful", emotion_code: "FAM-HOP", notes: "Hope + Struggle" }
  - { pattern: "i feel disconnected from everyone", emotion_code: "FAM-LON", notes: "Loneliness dominant" }
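Both maps are consumed the same way: normalize the input, then do an exact lookup (the updated EIL Processor later in this commit does this, with additional prefix stripping). A standalone illustrative sketch:

```python
# Illustrative sketch of the crosswalk / story-pattern lookup used by the EIL Processor.
import re
import yaml

with open("config/crosswalk.yaml", "r", encoding="utf-8") as f:
    data = yaml.safe_load(f)

crosswalk = {e["phrase"].lower(): e["emotion_code"] for e in data["crosswalk"]}
story_patterns = {e["pattern"].lower(): e["emotion_code"] for e in data.get("story_patterns", [])}

def lookup(text: str):
    # Lowercase and strip terminal punctuation before the exact match.
    norm = re.sub(r"[.!?]", "", text.lower().strip())
    return story_patterns.get(norm) or crosswalk.get(norm)

print(lookup("I feel angry!"))          # -> FAM-ANG
print(lookup("i feel like giving up"))  # -> FAM-HEL (story pattern override)
```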
config/emotion_families.yaml
CHANGED
@@ -1,13 +1,500 @@
# Emotion Families
emotion_families:

  - name: Joy
    code: FAM-JOY
    variants:
      - { name: Euphoria, code: VAR-JOY-01, aliases: [euphoria, euphoric, ecstatic, elated] }
      - { name: Contentment, code: VAR-JOY-02, aliases: [content, contentment, satisfied, at peace] }
      - { name: Excitement, code: VAR-JOY-03, aliases: [excited, excitement, thrilled, hyped] }
      - { name: Joyful Peace, code: VAR-JOY-06, aliases: [peaceful joy, serene joy, calm happiness] }
      - { name: Exuberance, code: VAR-JOY-07, aliases: [exuberant, overflowing joy, joy, so happy] }
      - { name: Relief, code: VAR-JOY-09, aliases: [relief, relieved, sense of relief] }
      - { name: Optimism, code: VAR-JOY-10, aliases: [optimistic, optimism, hopeful joy] }
      - { name: Pride, code: VAR-JOY-11, aliases: [proud, pride, feeling proud] }

  - name: Sadness
    code: FAM-SAD
    variants:
      - { name: Grief, code: VAR-SAD-01, aliases: [grief, grieving, deep sorrow] }
      - { name: Melancholy, code: VAR-SAD-02, aliases: [melancholy, wistful, blue] }
      - { name: Disappointment, code: VAR-SAD-03, aliases: [disappointed, disappointment, let down] }
      - { name: Coping Sadness, code: VAR-SAD-06, aliases: [managing sadness, handling sadness] }
      - { name: Hopelessness, code: VAR-SAD-07, aliases: [hopeless, no hope, giving up] }
      - { name: Remorse, code: VAR-SAD-08, aliases: [remorseful, remorse, regret] }

  - name: Anger
    code: FAM-ANG
    variants:
      - { name: Frustration, code: VAR-ANG-01, aliases: [frustrated, frustration, annoyed] }
      - { name: Rage, code: VAR-ANG-02, aliases: [rage, furious, enraged] }
      - { name: Irritation, code: VAR-ANG-03, aliases: [irritated, irritation, bugged] }
      - { name: Annoyance, code: VAR-ANG-04, aliases: [annoyed, annoyance] }
      - { name: Disapproval, code: VAR-ANG-05, aliases: [disapproval, disapproving, judgmental] }

  - name: Fear
    code: FAM-FEA
    variants:
      - { name: Anxiety, code: VAR-FEA-01, aliases: [anxious, anxiety, nervous] }
      - { name: Panic, code: VAR-FEA-02, aliases: [panic, panicked, freaking out] }
      - { name: Terror, code: VAR-FEA-03, aliases: [terror, terrified, horror] }
      - { name: Uncertainty, code: VAR-FEA-06, aliases: [uncertain, not sure, indecisive] }
      - { name: Dread, code: VAR-FEA-07, aliases: [dread, dreading] }
      - { name: Nervousness, code: VAR-FEA-08, aliases: [nervous, uneasy] }
      - { name: Confusion, code: VAR-FEA-09, aliases: [confused, unsure, puzzled] }

  - name: Love
    code: FAM-LOV
    variants:
      - { name: Affection, code: VAR-LOV-01, aliases: [affection, affectionate, fond] }
      - { name: Compassion, code: VAR-LOV-02, aliases: [compassionate, compassion, caring] }
      - { name: Passion, code: VAR-LOV-03, aliases: [passion, passionate, desire] }
      - { name: Longing Love, code: VAR-LOV-08, aliases: [longing, yearning, missing someone] }
      - { name: Caring, code: VAR-LOV-09, aliases: [caring, care] }
      - { name: Desire, code: VAR-LOV-10, aliases: [desire, wanting] }
      - { name: Gratitude, code: VAR-LOV-11, aliases: [grateful, gratitude, thankful] }
      - { name: Admiration, code: VAR-LOV-12, aliases: [admiration, admire, respecting] }
      - { name: Approval, code: VAR-LOV-13, aliases: [approval, approving] }

  - name: Disgust
    code: FAM-DISG
    variants:
      - { name: Disgust, code: VAR-DISG-01, aliases: [disgust, gross, yuck] }
      - { name: Revulsion, code: VAR-DISG-02, aliases: [revulsion, revolted] }
      - { name: Contempt, code: VAR-DISG-03, aliases: [contempt, contemptuous] }
      - { name: Nausea, code: VAR-DISG-04, aliases: [nauseated, nausea, sick to stomach] }
      - { name: Disdain, code: VAR-DISG-05, aliases: [disdain, disdainful] }

  - name: Curiosity
    code: FAM-CUR
    variants:
      - { name: Curiosity, code: VAR-CUR-01, aliases: [curious, curiosity, wondering] }
      - { name: Intrigue, code: VAR-CUR-02, aliases: [intrigued, intrigue] }
      - { name: Obsession, code: VAR-CUR-03, aliases: [obsessed, obsession] }
      - { name: Wonder, code: VAR-CUR-04, aliases: [wonder, amazed] }
      - { name: Skepticism, code: VAR-CUR-05, aliases: [skeptical, doubt] }

  - name: Guilt
    code: FAM-GUI
    variants:
      - { name: Remorse, code: VAR-GUI-01, aliases: [remorse, remorseful] }
      - { name: Regret, code: VAR-GUI-02, aliases: [regret, regretting] }
      - { name: Self-Reproach, code: VAR-GUI-03, aliases: [self-blame, blaming self] }
      - { name: Responsibility Guilt, code: VAR-GUI-04, aliases: [guilt, guilty, feeling responsible] }
      - { name: Reparative Guilt, code: VAR-GUI-05, aliases: [need to make amends, seeking forgiveness] }

  - name: Shame
    code: FAM-SHA
    variants:
      - { name: Embarrassment, code: VAR-SHA-01, aliases: [embarrassed, embarrassment, blushing] }
      - { name: Humiliation, code: VAR-SHA-02, aliases: [humiliated, humiliation] }
      - { name: Worthlessness, code: VAR-SHA-03, aliases: [worthless, unworthy] }
      - { name: Social Shame, code: VAR-SHA-04, aliases: [public shame, socially shamed] }
      - { name: Internalized Shame, code: VAR-SHA-05, aliases: [internalized shame, deep shame] }

  - name: Surprise
    code: FAM-SUR
    variants:
      - { name: Surprise, code: VAR-SUR-01, aliases: [surprised, surprise, wow] }
      - { name: Realization, code: VAR-SUR-02, aliases: [realization, realizing] }
      - { name: Amazement, code: VAR-SUR-03, aliases: [amazed, amazement] }
      - { name: Startle, code: VAR-SUR-04, aliases: [startled, startle] }
      - { name: Shock, code: VAR-SUR-05, aliases: [shocked, shocking] }

  - name: Playfulness
    code: FAM-PLA
    variants:
      - { name: Amusement, code: VAR-PLA-01, aliases: [amused, amusement] }
      - { name: Humor, code: VAR-PLA-02, aliases: [funny, humor, laughing] }
      - { name: Teasing, code: VAR-PLA-03, aliases: [teasing, tease] }
      - { name: Imaginative Play, code: VAR-PLA-04, aliases: [playing, pretend play] }
      - { name: Lightheartedness, code: VAR-PLA-05, aliases: [lighthearted, carefree] }

  - name: Trust
    code: FAM-TRU
    variants:
      - { name: Trust, code: VAR-TRU-01, aliases: [trust, trusting] }
      - { name: Comfort, code: VAR-TRU-02, aliases: [comforted, comforting, comfort] }
      - { name: Faith, code: VAR-TRU-03, aliases: [faith, believing] }
      - { name: Loyalty, code: VAR-TRU-04, aliases: [loyal, loyalty] }
      - { name: Security, code: VAR-TRU-05, aliases: [secure, safe, security] }

  - name: Hope
    code: FAM-HOP
    variants:
      - { name: Hope, code: VAR-HOP-01, aliases: [hope, hopeful] }
      - { name: Optimism, code: VAR-HOP-02, aliases: [optimism, optimistic] }
      - { name: Anticipation, code: VAR-HOP-03, aliases: [anticipation, anticipating] }
      - { name: Faith in Others, code: VAR-HOP-04, aliases: [faith in others, trust in others] }
      - { name: Visionary Hope, code: VAR-HOP-05, aliases: [visionary, visionary hope] }

  - name: Grief
    code: FAM-GRI
    variants:
      - { name: Grief, code: VAR-GRI-01, aliases: [grief, grieving, deep sorrow] }
      - { name: Bereavement, code: VAR-GRI-02, aliases: [bereavement, mourning, mourning a loss] }
      - { name: Nostalgic Grief, code: VAR-GRI-03, aliases: [nostalgic grief, missing the past] }
      - { name: Anticipatory Grief, code: VAR-GRI-04, aliases: [anticipatory grief, grief before loss] }
      - { name: Collective Grief, code: VAR-GRI-05, aliases: [collective grief, shared grief] }
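The variant aliases are what make free-text matching possible: they can be folded into a flat alias-to-variant-code lookup, which is what CodexInformer.build_alias_lookup does later in this commit. A short illustrative sketch:

```python
# Illustrative: fold variant aliases into a flat lookup, then resolve a word or phrase.
import yaml

with open("config/emotion_families.yaml", "r", encoding="utf-8") as f:
    families = yaml.safe_load(f)["emotion_families"]

alias_lookup = {}
for family in families:
    for variant in family.get("variants", []):
        alias_lookup[variant["name"].lower()] = variant["code"]
        for alias in variant.get("aliases", []):
            alias_lookup[alias.lower()] = variant["code"]

print(alias_lookup["euphoric"])   # -> VAR-JOY-01
print(alias_lookup["let down"])   # -> VAR-SAD-03
```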
config/meta_mappings.yaml
CHANGED
@@ -1,19 +1,25 @@
# Meta Mappings
meta_mappings:
  - marker: "META-FEAR-DISPLACED"
    inferred_blend: "FEA+ANG"
    comment: "Displaced fear, showing as irritability or frustration"

  - marker: "META-SOCIAL-WITHDRAWAL"
    inferred_blend: "SHA+SAD"
    comment: "Withdrawing due to social shame or sadness"

  - marker: "META-HOPE-FRAGILE"
    inferred_blend: "HOP+FEA"
    comment: "Hope mixed with fear of failure"

  - marker: "META-ANGER-INTERNALIZED"
    inferred_blend: "ANG+SHA"
    comment: "Internalized anger, linked to shame"

  - marker: "META-JOY-SAD"
    inferred_blend: "JOY+SAD"
    comment: "Joy mixed with sadness, often seen in complex emotional states like nostalgia or bittersweet moments"

  - marker: "META-ANGER-FEAR"
    inferred_blend: "ANG+FEA"
    comment: "Fear showing as anger, often in defensive or protective reactions"
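A small sketch of how a META marker could be expanded into its component family codes; the `expand_marker` helper is illustrative only and not code from this commit.

```python
# Hypothetical helper: expand a META marker into its inferred blend components.
import yaml

with open("config/meta_mappings.yaml", "r", encoding="utf-8") as f:
    meta = yaml.safe_load(f)["meta_mappings"]

def expand_marker(marker: str):
    for entry in meta:
        if entry["marker"] == marker:
            # "ANG+FEA" -> ["FAM-ANG", "FAM-FEA"]
            return [f"FAM-{part}" for part in entry["inferred_blend"].split("+")]
    return []

print(expand_marker("META-ANGER-FEAR"))  # -> ['FAM-ANG', 'FAM-FEA']
```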
config/resonance_mapping.yaml
ADDED
@@ -0,0 +1,15 @@
resonance_mapping:
  FAM-JOY: High energy, warm
  FAM-SAD: Low energy, cold
  FAM-ANG: Intense, sharp
  FAM-FEA: Tense, escalating
  FAM-LOV: Warm, supportive
  FAM-DISG: Cold, rejecting
  FAM-CUR: Focused, inquisitive
  FAM-GUI: Heavy, constrictive
  FAM-SHA: Heavy, cold
  FAM-SUR: Sharp, bright
  FAM-PLA: Light, playful
  FAM-TRU: Steady, warm
  FAM-HOP: Light, rising
  FAM-GRI: Deep, cold
config/response_strategies.yaml
ADDED
@@ -0,0 +1,65 @@
response_strategies:
  FAM-ANG:
    rsm_code: "RSM-DEESCALATE"
    strategy: "De-escalation & Boundary Validation"
    sample_response: "It’s completely understandable to feel this way when your boundaries feel crossed. You're not alone in this, and it’s okay to take space to ground and process."

  FAM-SAD:
    rsm_code: "RSM-COMFORT"
    strategy: "Compassion & Emotional Holding"
    sample_response: "I can feel how heavy this is for you. It’s okay to be in this place right now — and to know that this isn’t something you have to carry alone."

  FAM-FEA:
    rsm_code: "RSM-REASSURE"
    strategy: "Reassurance & Safety Anchoring"
    sample_response: "Your fears are valid. You’ve already shown strength by expressing them — and you deserve to feel safe and supported."

  FAM-JOY:
    rsm_code: "RSM-AFFIRMJOY"
    strategy: "Shared Joy & Positive Resonance"
    sample_response: "That’s wonderful to hear! I’m glad this brings you joy — may this feeling stay with you."

  FAM-LOV:
    rsm_code: "RSM-RECIPROCATECARE"
    strategy: "Affection & Trust Holding"
    sample_response: "That feeling of love is something so valuable. It’s okay to honor it at your own pace — and to let it support you."

  FAM-SHA:
    rsm_code: "RSM-SHAME-SOOTHE"
    strategy: "Shame Soothing & Self-Worth Support"
    sample_response: "Please know that shame can feel so isolating — but you are worthy of kindness and compassion, exactly as you are."

  FAM-GUI:
    rsm_code: "RSM-REPAIR"
    strategy: "Self-Compassion & Repair Pathway"
    sample_response: "Guilt can weigh so much because it shows you care. It’s also okay to begin the process of forgiving yourself and moving toward repair."

  FAM-LON:
    rsm_code: "RSM-CONNECTINVITE"
    strategy: "Connection Invitation & Compassionate Presence"
    sample_response: "That sense of loneliness is deeply human. I’m here, and when you’re ready, reaching out for connection is always a courageous act."

  FAM-HEL:
    rsm_code: "RSM-STABILIZE"
    strategy: "Gentle Stabilization & Encouragement"
    sample_response: "Helplessness is so hard to sit with — but reaching out shows strength. One step at a time is more than enough right now."

  FAM-HOP:
    rsm_code: "RSM-HOPE-AMPLIFY"
    strategy: "Hope Amplification & Gentle Encouragement"
    sample_response: "Holding onto hope can be an act of great courage. I believe in your resilience — you are moving forward even now."

  FAM-SUR:
    rsm_code: "RSM-GROUND-SURPRISE"
    strategy: "Grounding & Acknowledgment of Surprise"
    sample_response: "That sounds like it took you by surprise. Take your time — it’s okay to process this at your own pace."

  FAM-DIS:
    rsm_code: "RSM-ACKNOWLEDGE-BOUNDARY"
    strategy: "Respectful Acknowledgment & Containment"
    sample_response: "That reaction makes sense — it’s okay to have that boundary or sense of rejection toward what you experienced."

  FAM-NEU:
    rsm_code: "RSM-INQUIRE"
    strategy: "Gentle Curiosity & Emotional Invitation"
    sample_response: "Sometimes emotions aren’t clear — and that’s completely okay. When you're ready, I’m here to explore it with you."
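The commit message mentions dynamic response strategy mapping, but the consuming code (fec_controller.py, eris_reasoner.py) is not shown in this section, so the sketch below only illustrates the obvious lookup shape under that assumption; it is not the repo's implementation.

```python
# Illustrative sketch: map a resolved family code to its response strategy entry.
import yaml

with open("config/response_strategies.yaml", "r", encoding="utf-8") as f:
    strategies = yaml.safe_load(f)["response_strategies"]

def select_strategy(family_code: str) -> dict:
    # Fall back to the neutral strategy when a family has no explicit entry.
    return strategies.get(family_code, strategies["FAM-NEU"])

chosen = select_strategy("FAM-ANG")
print(chosen["rsm_code"])   # -> RSM-DEESCALATE
print(chosen["strategy"])   # -> De-escalation & Boundary Validation
```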
config/sal_triggers.yaml
CHANGED
@@ -1,22 +1,33 @@
# SAL Triggers
sal_triggers:
  - trigger: "I'm fine"
    meaning: "Possible suppression or avoidance"
    flag: "SAL-001"

  - trigger: "It doesn't matter"
    meaning: "Resignation, possible sadness"
    flag: "SAL-002"

  - trigger: "I'm okay, really"
    meaning: "Masking, potential hidden emotion"
    flag: "SAL-003"

  - trigger: "Whatever"
    meaning: "Emotional disengagement"
    flag: "SAL-004"

  - trigger: "I don't care"
    meaning: "Possible anger or frustration"
    flag: "SAL-005"

  - trigger: "No big deal"
    meaning: "Minimization of emotional impact"
    flag: "SAL-006"

  - trigger: "I'm just tired"
    meaning: "Emotional exhaustion, hiding deeper frustration or sadness"
    flag: "SAL-008"

  - trigger: "I don't know what you want from me"
    meaning: "Defensiveness or frustration with expectations"
    flag: "SAL-009"
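An earlier CodexInformer scanned tokens against these triggers; that scan does not appear in the new version below, so the following is only a hedged illustration of how the trigger list can be applied to raw text.

```python
# Illustrative SAL scan: flag masking / suppression phrases found in the input.
import yaml

with open("config/sal_triggers.yaml", "r", encoding="utf-8") as f:
    sal_triggers = yaml.safe_load(f)["sal_triggers"]

def scan_sal(text: str):
    hits = []
    lowered = text.lower()
    for trigger in sal_triggers:
        if trigger["trigger"].lower() in lowered:
            hits.append({"flag": trigger["flag"], "meaning": trigger["meaning"]})
    return hits

print(scan_sal("I'm fine, it doesn't matter anyway."))
# -> hits for SAL-001 ("I'm fine") and SAL-002 ("It doesn't matter")
```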
core/codex_informer.py
CHANGED
@@ -1,69 +1,92 @@
# core/codex_informer.py
# Master Emotional Core (MEC) - Codex Informer v3 (Memory-Optimized)

import yaml

class CodexInformer:
    def __init__(self, emotion_families_path='config/emotion_families.yaml',
                 arc_mapping_path='config/arc_mapping.yaml',
                 resonance_mapping_path='config/resonance_mapping.yaml'):

        # Load emotion families YAML
        with open(emotion_families_path, 'r', encoding='utf-8') as f:
            self.emotion_families = yaml.safe_load(f)['emotion_families']

        # Load arc mapping YAML
        with open(arc_mapping_path, 'r', encoding='utf-8') as f:
            self.arc_lookup = yaml.safe_load(f)['arc_mapping']

        # Load resonance mapping YAML
        with open(resonance_mapping_path, 'r', encoding='utf-8') as f:
            self.resonance_lookup = yaml.safe_load(f)['resonance_mapping']

        # Build lookup tables
        self.family_lookup = {}
        self.variant_lookup = {}

        for family in self.emotion_families:
            fam_code = family['code']
            self.family_lookup[fam_code] = True
            for variant in family.get('variants', []):
                var_code = variant['code']
                self.variant_lookup[var_code] = {
                    'family_code': fam_code
                }

        print(f"[CodexInformer] Loaded {len(self.family_lookup)} families, {len(self.variant_lookup)} variants, {len(self.arc_lookup)} arcs, {len(self.resonance_lookup)} resonance patterns")

    def resolve_emotion_family(self, emotion_code):
        # CASE 1: Variant code
        if emotion_code.startswith("VAR-"):
            if emotion_code in self.variant_lookup:
                fam_code = self.variant_lookup[emotion_code]['family_code']
                arc = self.arc_lookup.get(fam_code, "Stable")
                resonance = self.resonance_lookup.get(fam_code, "Neutral")
                return {
                    'primary_emotion_code': emotion_code,
                    'emotion_family': fam_code,
                    'arc': arc,
                    'resonance': resonance
                }
            else:
                print(f"[CodexInformer] Unknown VAR code: {emotion_code}")
                return self._unknown_response()

        # CASE 2: Family code
        elif emotion_code.startswith("FAM-"):
            if emotion_code in self.family_lookup:
                arc = self.arc_lookup.get(emotion_code, "Stable")
                resonance = self.resonance_lookup.get(emotion_code, "Neutral")
                return {
                    'primary_emotion_code': emotion_code,
                    'emotion_family': emotion_code,
                    'arc': arc,
                    'resonance': resonance
                }
            else:
                print(f"[CodexInformer] Unknown FAM code: {emotion_code}")
                return self._unknown_response()

        # CASE 3: Invalid format
        else:
            print(f"[CodexInformer] Unknown code format: {emotion_code}")
            return self._unknown_response()

    def build_alias_lookup(self):
        alias_lookup = {}
        for family in self.emotion_families:
            for variant in family.get('variants', []):
                variant_code = variant['code']
                aliases = variant.get('aliases', [])
                alias_lookup[variant['name'].lower()] = variant_code
                for alias in aliases:
                    alias_lookup[alias.lower()] = variant_code
        return alias_lookup

    def _unknown_response(self):
        return {
            'primary_emotion_code': "Unknown",
            'emotion_family': "Unknown",
            'arc': "Unknown",
            'resonance': "Unknown"
        }
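A short usage sketch for the new resolver, assuming core/ is importable as a package and the config files above sit at their default paths:

```python
# Usage sketch: resolve a variant code to its family, arc, and resonance.
from core.codex_informer import CodexInformer

codex = CodexInformer()  # loads emotion_families, arc_mapping, resonance_mapping

print(codex.resolve_emotion_family("VAR-JOY-01"))
# -> {'primary_emotion_code': 'VAR-JOY-01', 'emotion_family': 'FAM-JOY',
#     'arc': 'Rising', 'resonance': 'High energy, warm'}

print(codex.resolve_emotion_family("FAM-GRI"))
# -> {'primary_emotion_code': 'FAM-GRI', 'emotion_family': 'FAM-GRI',
#     'arc': 'Falling', 'resonance': 'Deep, cold'}
```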
core/eil_processor.py
CHANGED
|
@@ -1,59 +1,257 @@
|
|
| 1 |
# core/eil_processor.py
|
|
|
|
| 2 |
|
| 3 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
| 4 |
|
| 5 |
class EILProcessor:
|
| 6 |
-
def __init__(self,
|
| 7 |
-
self.
|
| 8 |
-
self.
|
| 9 |
-
|
| 10 |
-
#
|
| 11 |
-
self.
|
| 12 |
-
|
| 13 |
-
|
| 14 |
-
#
|
| 15 |
-
|
| 16 |
-
|
| 17 |
-
|
| 18 |
-
|
| 19 |
-
|
| 20 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 21 |
}
|
| 22 |
-
|
| 23 |
-
|
| 24 |
-
|
| 25 |
-
|
| 26 |
-
|
| 27 |
-
|
| 28 |
-
|
| 29 |
-
|
| 30 |
-
|
| 31 |
-
|
| 32 |
-
|
| 33 |
-
|
| 34 |
-
|
| 35 |
-
|
| 36 |
-
emotion_family = self.codex_informer.get_emotion_family(emotion_code) # Resolve emotion family
|
| 37 |
-
arc = self.codex_informer.get_arc(emotion_code) # Resolve emotion arc
|
| 38 |
-
resonance = self.codex_informer.get_resonance(emotion_code) # Resolve resonance
|
| 39 |
-
|
| 40 |
-
# If we don't find a match, we can use a fallback or 'hidden emotion' state
|
| 41 |
-
if emotion_family == "Unknown":
|
| 42 |
-
emotion_family = "Hidden Emotion Detected" # Placeholder for hidden emotion
|
| 43 |
-
|
| 44 |
-
# Build the EIL packet with additional emotion data from Codex Informer
|
| 45 |
-
eil_packet = {
|
| 46 |
-
"phrases": phrases,
|
| 47 |
-
"emotion_candidates": [
|
| 48 |
-
{"phrase": p.strip(), "candidate_emotion": "Pending"} for p in phrases if p.strip()
|
| 49 |
-
],
|
| 50 |
-
"metadata": {
|
| 51 |
-
"source": "InputPreprocessor + EILProcessor"
|
| 52 |
-
},
|
| 53 |
-
"emotion_family": emotion_family, # From Codex Informer
|
| 54 |
-
"arc": arc, # From Codex Informer
|
| 55 |
-
"resonance": resonance # From Codex Informer
|
| 56 |
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 57 |
|
| 58 |
-
|
| 59 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
# core/eil_processor.py
|
| 2 |
+
# Master Emotional Core (MEC) - EIL Processor (Signal Normalization Edition)
|
| 3 |
|
| 4 |
+
import yaml
|
| 5 |
+
import re
|
| 6 |
+
from transformers import AutoTokenizer, AutoModelForSequenceClassification
|
| 7 |
+
import torch
|
| 8 |
+
import torch.nn.functional as F
|
| 9 |
|
| 10 |
class EILProcessor:
|
| 11 |
+
def __init__(self, codex_informer, softmax_threshold=0.6):
|
| 12 |
+
self.codex_informer = codex_informer
|
| 13 |
+
self.softmax_threshold = softmax_threshold
|
| 14 |
+
|
| 15 |
+
# Build alias lookup from Codex
|
| 16 |
+
self.alias_lookup = self.codex_informer.build_alias_lookup()
|
| 17 |
+
print(f"[EILProcessor] Alias map loaded with {len(self.alias_lookup)} entries")
|
| 18 |
+
|
| 19 |
+
# Load crosswalk.yaml
|
| 20 |
+
with open('config/crosswalk.yaml', 'r', encoding='utf-8') as f:
|
| 21 |
+
yaml_data = yaml.safe_load(f)
|
| 22 |
+
crosswalk_data = yaml_data['crosswalk']
|
| 23 |
+
story_pattern_data = yaml_data.get('story_patterns', [])
|
| 24 |
+
|
| 25 |
+
# Build crosswalk lookup
|
| 26 |
+
self.crosswalk_lookup = {}
|
| 27 |
+
for entry in crosswalk_data:
|
| 28 |
+
phrase = self.normalize_text(entry['phrase'])
|
| 29 |
+
emotion_code = entry['emotion_code']
|
| 30 |
+
self.crosswalk_lookup[phrase] = emotion_code
|
| 31 |
+
|
| 32 |
+
# Build story_patterns lookup
|
| 33 |
+
self.story_patterns_lookup = {}
|
| 34 |
+
for entry in story_pattern_data:
|
| 35 |
+
pattern = self.normalize_text(entry['pattern'])
|
| 36 |
+
emotion_code = entry['emotion_code']
|
| 37 |
+
self.story_patterns_lookup[pattern] = emotion_code
|
| 38 |
+
|
| 39 |
+
print(f"[EILProcessor] Crosswalk loaded with {len(self.crosswalk_lookup)} entries")
|
| 40 |
+
print(f"[EILProcessor] Story Patterns loaded with {len(self.story_patterns_lookup)} entries")
|
| 41 |
+
|
| 42 |
+
# Emotion keyword dictionary for signal normalization/blending
|
| 43 |
+
self.emotion_keyword_map = {
|
| 44 |
+
"FAM-ANG": ["anger", "angry", "hate", "furious", "rage", "resentment"],
|
| 45 |
+
"FAM-HEL": ["helpless", "powerless", "can't", "unable", "trapped", "stuck"],
|
| 46 |
+
"FAM-SAD": ["sad", "down", "unhappy", "miserable", "depressed", "blue"],
|
| 47 |
+
"FAM-FEA": ["afraid", "scared", "fear", "terrified", "worried", "nervous", "anxious"],
|
| 48 |
+
"FAM-LOV": ["love", "loved", "loving", "caring", "affection"],
|
| 49 |
+
"FAM-JOY": ["joy", "happy", "excited", "delighted", "content"],
|
| 50 |
+
"FAM-SUR": ["surprised", "amazed", "astonished", "shocked"],
|
| 51 |
+
"FAM-DIS": ["disgust", "disgusted", "gross", "revolted"],
|
| 52 |
+
"FAM-SHA": ["ashamed", "shame", "embarrassed", "humiliated"],
|
| 53 |
+
"FAM-GUI": ["guilty", "guilt", "remorse", "regret"],
|
| 54 |
+
# Add more as needed
|
| 55 |
}
|
| 56 |
+
|
| 57 |
+
# Load tokenizer and model
|
| 58 |
+
self.tokenizer = AutoTokenizer.from_pretrained('cardiffnlp/twitter-roberta-base-emotion')
|
| 59 |
+
self.model = AutoModelForSequenceClassification.from_pretrained('cardiffnlp/twitter-roberta-base-emotion')
|
| 60 |
+
|
| 61 |
+
def normalize_text(self, text):
|
| 62 |
+
normalization_map = {
|
| 63 |
+
"i am feeling ": "",
|
| 64 |
+
"i feel ": "",
|
| 65 |
+
"feeling ": "",
|
| 66 |
+
"i'm feeling ": "",
|
| 67 |
+
"i am ": "",
|
| 68 |
+
"i'm ": ""
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 69 |
}
|
| 70 |
+
text = text.lower().strip()
|
| 71 |
+
for k, v in normalization_map.items():
|
| 72 |
+
if text.startswith(k):
|
| 73 |
+
text = text.replace(k, "", 1)
|
| 74 |
+
break
|
| 75 |
+
text = re.sub(r'[.!?]', '', text)
|
| 76 |
+
return text
|
| 77 |
+
|
| 78 |
+
def is_story_input(self, text):
|
| 79 |
+
clause_markers = [',', ';', '.', 'but', 'because', 'so that', 'which', 'when', 'while']
|
| 80 |
+
token_count = len(text.split())
|
| 81 |
+
clause_hits = any(marker in text for marker in clause_markers)
|
| 82 |
+
if token_count > 12 or clause_hits:
|
| 83 |
+
return True
|
| 84 |
+
return False
|
| 85 |
+
|
| 86 |
+
def chunk_story(self, text):
|
| 87 |
+
# Also split on conjunctions and relative pronouns, not just punctuation
|
| 88 |
+
chunks = re.split(r'[.,;!?]|\b(?:and|but|because|so|although|though|while|when)\b', text, flags=re.IGNORECASE)
|
| 89 |
+
chunks = [chunk.strip() for chunk in chunks if chunk and chunk.strip()]
|
| 90 |
+
return chunks
|
| 91 |
+
|
| 92 |
+
def detect_emotion_blend(self, norm_text):
|
| 93 |
+
blend = {}
|
| 94 |
+
for fam, keywords in self.emotion_keyword_map.items():
|
| 95 |
+
for kw in keywords:
|
| 96 |
+
if kw in norm_text:
|
| 97 |
+
blend[fam] = blend.get(fam, 0) + 1.0
|
| 98 |
+
return blend
|
| 99 |
+
|
| 100 |
+
def infer_emotion(self, input_text):
|
| 101 |
+
norm_text = self.normalize_text(input_text)
|
| 102 |
+
|
| 103 |
+
# 1️⃣ Story Pattern Override
|
| 104 |
+
if norm_text in self.story_patterns_lookup:
|
| 105 |
+
primary_emotion_code = self.story_patterns_lookup[norm_text]
|
| 106 |
+
emotion_data = self.codex_informer.resolve_emotion_family(primary_emotion_code)
|
| 107 |
+
print(f"[EILProcessor] Story Pattern match: '{norm_text}' → {primary_emotion_code}")
|
| 108 |
+
packet = {
|
| 109 |
+
'phrases': [input_text],
|
| 110 |
+
```python
                # (continuation of EILProcessor.infer_emotion — tail of the story-pattern packet)
                'emotion_candidates': [{'phrase': input_text, 'candidate_emotion': primary_emotion_code}],
                'metadata': {'source': 'EILProcessor (story pattern)', 'input_type': 'story'},
                'emotion_family': emotion_data['emotion_family'],
                'primary_emotion_code': emotion_data['primary_emotion_code'],
                'arc': emotion_data['arc'],
                'resonance': emotion_data['resonance'],
                'blend': {emotion_data['primary_emotion_code']: 1.0}
            }
            return packet

        # 2️⃣ Story detection (chunking and blend aggregation)
        input_type = 'phrase'
        if self.is_story_input(norm_text):
            input_type = 'story'
            print(f"[EILProcessor] Story mode activated for input: '{norm_text}'")
            chunks = self.chunk_story(norm_text)

            chunk_results = []
            blend_accum = {}

            for chunk in chunks:
                sub_result = self.infer_emotion(chunk)  # RECURSIVE CALL
                chunk_results.append(sub_result)
                # Accumulate blends
                for fam, val in sub_result.get('blend', {}).items():
                    blend_accum[fam] = blend_accum.get(fam, 0) + val

            # Normalize blend
            if blend_accum:
                total = sum(blend_accum.values())
                for k in blend_accum:
                    blend_accum[k] /= total

                dominant_family = max(blend_accum.items(), key=lambda x: x[1])[0]
            else:
                dominant_family = "FAM-NEU"
                blend_accum = {"FAM-NEU": 1.0}

            emotion_data = self.codex_informer.resolve_emotion_family(dominant_family)
            packet = {
                'phrases': [input_text] + [r['phrases'][0] for r in chunk_results],
                'emotion_candidates': [{'phrase': r['phrases'][0], 'candidate_emotion': r['primary_emotion_code']} for r in chunk_results],
                'metadata': {'source': 'EILProcessor (story mode)', 'input_type': input_type},
                'emotion_family': emotion_data['emotion_family'],
                'primary_emotion_code': emotion_data['primary_emotion_code'],
                'arc': emotion_data['arc'],
                'resonance': emotion_data['resonance'],
                'blend': blend_accum
            }
            return packet

        # 3️⃣ Crosswalk check
        if norm_text in self.crosswalk_lookup:
            primary_emotion_code = self.crosswalk_lookup[norm_text]
            emotion_data = self.codex_informer.resolve_emotion_family(primary_emotion_code)
            print(f"[EILProcessor] Crosswalk match: '{norm_text}' → {primary_emotion_code}")
            packet = {
                'phrases': [input_text],
                'emotion_candidates': [{'phrase': input_text, 'candidate_emotion': primary_emotion_code}],
                'metadata': {'source': 'EILProcessor (crosswalk)', 'input_type': input_type},
                'emotion_family': emotion_data['emotion_family'],
                'primary_emotion_code': emotion_data['primary_emotion_code'],
                'arc': emotion_data['arc'],
                'resonance': emotion_data['resonance'],
                'blend': {emotion_data['primary_emotion_code']: 1.0}
            }
            return packet

        # 4️⃣ Alias lookup
        if norm_text in self.alias_lookup:
            variant_code = self.alias_lookup[norm_text]
            emotion_family = variant_code.split('-')[1]
            family_code = f"FAM-{emotion_family}"
            print(f"[EILProcessor] Alias match: '{norm_text}' → {variant_code}")
            packet = {
                'phrases': [input_text],
                'emotion_candidates': [{'phrase': input_text, 'candidate_emotion': variant_code}],
                'metadata': {'source': 'EILProcessor (alias match)', 'input_type': input_type},
                'emotion_family': family_code,
                'primary_emotion_code': variant_code,
                'arc': 'Pending',
                'resonance': 'Pending',
                'blend': {variant_code: 1.0}
            }
            return packet

        # 5️⃣ Signal normalization - keyword blend check
        blend = self.detect_emotion_blend(norm_text)
        if blend:
            # Normalize
            total = sum(blend.values())
            for k in blend:
                blend[k] /= total
            primary_code = max(blend.items(), key=lambda x: x[1])[0]
            emotion_data = self.codex_informer.resolve_emotion_family(primary_code)
            print(f"[EILProcessor] Signal normalization keyword blend: {blend} (primary: {primary_code})")
            packet = {
                'phrases': [input_text],
                'emotion_candidates': [{'phrase': input_text, 'candidate_emotion': primary_code}],
                'metadata': {'source': 'EILProcessor (signal normalization)', 'input_type': input_type},
                'emotion_family': emotion_data['emotion_family'],
                'primary_emotion_code': emotion_data['primary_emotion_code'],
                'arc': emotion_data['arc'],
                'resonance': emotion_data['resonance'],
                'blend': blend
            }
            return packet

        # 6️⃣ Model fallback
        print(f"[EILProcessor] No crosswalk/alias/keyword match — running model on: '{norm_text}'")
        tokens = self.tokenizer(norm_text, return_tensors='pt')
        with torch.no_grad():
            logits = self.model(**tokens).logits
            probs = F.softmax(logits, dim=-1).squeeze()
            top_prob, top_idx = torch.max(probs, dim=-1)
            predicted_label = self.model.config.id2label[top_idx.item()]
            confidence = top_prob.item()

        if confidence < self.softmax_threshold:
            predicted_label = 'neutral'
            print(f"[EILProcessor] Low confidence ({confidence:.2f}) — setting to 'neutral'")

        print(f"[EILProcessor] Model prediction: {predicted_label} ({confidence:.2f})")
        model_to_codex_map = {
            "joy": "FAM-JOY",
            "anger": "FAM-ANG",
            "sadness": "FAM-SAD",
            "fear": "FAM-FEA",
            "love": "FAM-LOV",
            "surprise": "FAM-SUR",
            "disgust": "FAM-DIS",
            "neutral": "FAM-NEU"
        }
        primary_emotion_code = model_to_codex_map.get(predicted_label.lower(), "FAM-NEU")
        emotion_data = self.codex_informer.resolve_emotion_family(primary_emotion_code)
        blend = {emotion_data['primary_emotion_code']: 1.0}

        packet = {
            'phrases': [input_text],
            'emotion_candidates': [{'phrase': input_text, 'candidate_emotion': predicted_label}],
            'metadata': {'source': 'EILProcessor (model)', 'input_type': input_type},
            'emotion_family': emotion_data['emotion_family'],
            'primary_emotion_code': emotion_data['primary_emotion_code'],
            'arc': emotion_data['arc'],
            'resonance': emotion_data['resonance'],
            'blend': blend
        }
        return packet
```
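Every branch above returns the same packet shape, which the rest of the pipeline consumes. As a rough illustration (not actual Codex output), a crosswalk hit on an anger phrase would yield something like this sketch; the arc and resonance strings below are placeholders, since the real values come from the Codex YAML files:

```python
# Illustrative only — field values depend on config/crosswalk.yaml and the Codex data.
example_packet = {
    'phrases': ["I am furious"],
    'emotion_candidates': [{'phrase': "I am furious", 'candidate_emotion': "FAM-ANG"}],
    'metadata': {'source': 'EILProcessor (crosswalk)', 'input_type': 'phrase'},
    'emotion_family': "FAM-ANG",
    'primary_emotion_code': "FAM-ANG",
    'arc': "<arc trajectory from Codex>",
    'resonance': "<resonance pattern from Codex>",
    'blend': {"FAM-ANG": 1.0}
}
```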
core/eris_reasoner.py
CHANGED
```python
# core/eris_reasoner.py

import hashlib
import time
from core.codex_informer import CodexInformer
from core.hei_inference import HEIInference


class ERISReasoner:
    def __init__(self):
        self.codex_informer = CodexInformer()

    def reason_emotion_state(self, esil_packet):
        # Generate EmID components
        user_id = "user123"

        # Get primary_emotion and primary_emotion_code from ESIL packet
        primary_emotion = esil_packet.get("emotion_family", "Unknown")
        primary_emotion_code = esil_packet.get("primary_emotion_code", "Unknown")

        # Resolve arc and resonance via Codex using primary_emotion_code
        resolved_family = self.codex_informer.resolve_emotion_family(primary_emotion_code)
        arc = resolved_family['arc']
        resonance = resolved_family['resonance']

        # Build UESP packet
        uesp_packet = {
            "Primary Emotion": primary_emotion,
            "Primary Emotion Code": primary_emotion_code,
            "Emotion Arc Trajectory": arc,
            "Resonance Pattern": resonance,
            "HEART Compliance Flags": ["HVC-000"],  # Placeholder
            "Empathy First Response": esil_packet.get("response", "Emotion being processed..."),
            "emotion_family": primary_emotion
        }

        # If no valid family — trigger HEI fallback
        if primary_emotion_code == "Unknown" or primary_emotion == "FAM-HID" or primary_emotion == "Hidden Emotion Detected":
            print("[ERISReasoner] No valid emotion family found — triggering HEI fallback...")
            hei = HEIInference()
            uesp_packet = hei.detect_low_signal(uesp_packet)

        # Ensure Primary Emotion Code survives even after HEI fallback
        if "Primary Emotion Code" not in uesp_packet:
            uesp_packet["Primary Emotion Code"] = primary_emotion_code

        # Build clean family code string for EmID
        family_code_str = uesp_packet['Primary Emotion'].replace("FAM-", "")

        # Get current timestamp (ISO-like)
        timestamp_str = time.strftime("%Y%m%dT%H%M%S")

        # Build hashed payload
        emid_payload = f"{user_id}-{family_code_str}-{timestamp_str}"
        emid_hash = hashlib.sha256(emid_payload.encode()).hexdigest()

        # Final EmID: user-JOY-20250623T203633-<hash>
        uesp_packet["EmID"] = f"user-{family_code_str}-{timestamp_str}-{emid_hash}"

        print(f"[ERISReasoner] Generated UESP Packet: {uesp_packet}")
        return uesp_packet
```
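A minimal sketch of how the EmID comes together for a FAM-JOY state, assuming the same `user123` placeholder used above; the hash is simply SHA-256 over the payload string:

```python
import hashlib
import time

user_id = "user123"
family_code_str = "JOY"                          # "FAM-JOY" with the prefix stripped
timestamp_str = time.strftime("%Y%m%dT%H%M%S")   # e.g. "20250623T203633"

# Payload is hashed, then the readable parts are kept in the final EmID string
payload = f"{user_id}-{family_code_str}-{timestamp_str}"
emid = f"user-{family_code_str}-{timestamp_str}-{hashlib.sha256(payload.encode()).hexdigest()}"
print(emid)  # user-JOY-<timestamp>-<64-char SHA-256 hex digest>
```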
core/esil_inference.py
CHANGED
```python
# @@ -26,9 +26,13 @@ (inside ESILInference.infer_esil)

        # Retrieve emotion family, arc, and resonance from Codex Informer
        primary_emotion_code = eil_packet.get("primary_emotion_code", "UNK")

        # Ensure the primary emotion code is resolved correctly by CodexInformer
        emotion_data = self.codex_informer.resolve_emotion_family(primary_emotion_code)

        emotion_family = emotion_data['emotion_family']
        arc = emotion_data['arc']
        resonance = emotion_data['resonance']

        # If no emotion family is found, flag it as a "hidden emotion"
        if emotion_family == "Unknown":
            # ... (unchanged in this commit) ...

# @@ -44,6 +48,7 @@ (ESIL packet construction)

            "emotion_family": emotion_family,  # From Codex Informer
            "arc": arc,  # From Codex Informer
            "resonance": resonance,  # From Codex Informer
            "primary_emotion_code": primary_emotion_code,  # <-- PATCH INCLUDED
            "source_metadata": eil_packet.get("metadata", {}),
            "tokens": phrases
        }
```
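Both ESIL and EIL depend on `CodexInformer.resolve_emotion_family` returning a small dictionary. Based on how the result is consumed above, the assumed shape is roughly the following; the values shown are placeholders, not real entries from `config/emotion_families.yaml`:

```python
# Assumed return shape of CodexInformer.resolve_emotion_family("FAM-ANG") —
# the values here are placeholders, not actual Codex entries.
emotion_data = {
    'emotion_family': "FAM-ANG",
    'primary_emotion_code': "FAM-ANG",
    'arc': "<arc trajectory label>",
    'resonance': "<resonance pattern label>",
}
```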
core/fec_controller.py
CHANGED
```python
# core/fec_controller.py
# Master Emotional Core (MEC) - FECController v1.0 (LLM Instruction Layer)

class FECController:
    def __init__(self):
        pass

    def generate_prompt(self, final_uesp):
        # Validate required keys
        required_keys = [
            'emotion_family',
            'Primary Emotion Code',
            'Emotion Arc Trajectory',
            'Resonance Pattern'
        ]

        for key in required_keys:
            if key not in final_uesp:
                raise KeyError(f"Missing required key: {key}")

        # Map human-readable emotion label from code
        emotion_code = final_uesp['Primary Emotion Code']
        # For MVP, just simplify code extraction:
        if emotion_code.startswith("VAR-"):
            emotion_label = emotion_code.replace("VAR-", "").split('-')[0].title()
        elif emotion_code.startswith("FAM-"):
            emotion_label = emotion_code.replace("FAM-", "").title()
        else:
            emotion_label = "Unknown"

        # Base fields
        arc = final_uesp['Emotion Arc Trajectory']
        resonance = final_uesp['Resonance Pattern']

        # Placeholder — eventually populated by ERIS/FEC blends
        blend_states = final_uesp.get('Blend States', 'None detected')

        # Placeholder — Response Strategy (will be routed later by ERIS/FEC)
        response_strategy = final_uesp.get('Response Strategy', 'RSM-DEFAULT')

        # Placeholder — T-Code (if tone classifiers applied — for now None)
        tone_classification = final_uesp.get('Tone Classification', 'TBD')

        # Build Fusion Prompt
        fusion_prompt = f"""
Contextual Emotional State:

- Primary Emotion: {emotion_label}
- Emotional Arc: {arc}
- Resonance: {resonance}
- Tone: {tone_classification}
- Blend States: {blend_states}
- Intervention Strategy: {response_strategy}

Empathic Objective:
Align your response tone and content with the above emotional context. Prioritize emotional authenticity, HEART-compliant care, and safety. Do not merely reflect emotion — respond in an emotionally aligned and supportive manner that honors the user’s lived experience.

HEART Compliance: Response must align with HEART™ ethical principles — transparency, alignment, contextuality, traceability, and emotional safety.
""".strip()

        return fusion_prompt
```
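A quick usage sketch for the controller above; the UESP values are made up, but the keys match the `required_keys` check, so this should render a fusion prompt with `Primary Emotion: Joy` and the default `RSM-DEFAULT` strategy:

```python
from core.fec_controller import FECController

fec = FECController()
sample_uesp = {
    'emotion_family': "FAM-JOY",
    'Primary Emotion Code': "FAM-JOY",
    'Emotion Arc Trajectory': "Rising",   # placeholder value, not a real Codex arc
    'Resonance Pattern': "Stable",        # placeholder value, not a real Codex pattern
}
print(fec.generate_prompt(sample_uesp))
```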
gradio_ui.py
ADDED
```python
import gradio as gr
import yaml
import os

from core.codex_informer import CodexInformer
from core.eil_processor import EILProcessor
from core.esil_inference import ESILInference
from core.eris_reasoner import ERISReasoner
from core.fec_controller import FECController

# Initialize MEC Core Components
codex_informer = CodexInformer()
eil_processor = EILProcessor(codex_informer)
esil_formatter = ESILInference(codex_informer)
eris_engine = ERISReasoner()
fec_controller = FECController()

# Load Response Strategies YAML
config_path = os.path.join("config", "response_strategies.yaml")
with open(config_path, 'r', encoding='utf-8') as f:
    response_strategies = yaml.safe_load(f).get('response_strategies', {})

def process_input(user_text):
    try:
        print(f"[DEBUG] Received user input: {user_text!r}")

        # Step 1: EIL Processing
        eil_result = eil_processor.infer_emotion(user_text)
        print(f"[DEBUG] EIL Result: {eil_result}")

        # Step 2: ESIL Formatting
        esil_packet = esil_formatter.infer_esil(eil_result)
        print(f"[DEBUG] ESIL Packet: {esil_packet}")

        # Step 3: ERIS Reasoning
        eris_result = eris_engine.reason_emotion_state(esil_packet)
        print(f"[DEBUG] ERIS Result: {eris_result}")

        # Pull out your primary_emotion_code (case-agnostic)
        fam_code = (
            eris_result.get('primary_emotion_code') or
            eris_result.get('Primary Emotion Code') or
            eris_result.get('Primary Emotion')
        )
        if not fam_code:
            raise KeyError("`primary_emotion_code` missing in ERIS result")

        # Lookup response strategy
        rs = response_strategies.get(fam_code, {})
        rsm_code = rs.get('rsm_code', 'RSM-UNKNOWN')
        strategy_name = rs.get('strategy', 'Strategy not defined')
        sample_response = rs.get('sample_response', 'No response available')

        simulated_output = (
            f"Response Strategy Code: {rsm_code}\n"
            f"Response Strategy: {strategy_name}\n\n"
            f"{sample_response}"
        )

        # Step 4: Fusion Engine – ensure keys match what FECController expects
        final_uesp = {
            # FEC expects these keys:
            'emotion_family': eris_result.get('emotion_family', fam_code),
            'Primary Emotion Code': fam_code,
            **eris_result  # Carry over everything else, in case
        }

        fusion_prompt = fec_controller.generate_prompt(final_uesp)
        print("[DEBUG] Fusion Prompt Generated")

        return fusion_prompt, simulated_output

    except Exception as e:
        print(f"[ERROR] Exception in process_input: {e}")
        return "[ERROR: Unable to process input]", f"An error occurred: {e}"

def launch_ui():
    with gr.Blocks(title="MEC MVP – Empathic UI") as demo:
        gr.Markdown("# Master Emotional Core (MEC) MVP UI\nEmpathy-first AI. Built to protect.")

        with gr.Row():
            user_input = gr.Textbox(label="Enter your message:", placeholder="Type here...", lines=3)

        with gr.Row():
            submit_button = gr.Button("Process Input")

        with gr.Row():
            fusion_prompt_display = gr.Textbox(label="Fusion Prompt", lines=6, interactive=False)
            empathic_response_display = gr.Textbox(label="Simulated Empathic Response", lines=8, interactive=False)

        submit_button.click(
            fn=process_input,
            inputs=[user_input],
            outputs=[fusion_prompt_display, empathic_response_display]
        )

    demo.launch()

if __name__ == "__main__":
    launch_ui()
```
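The UI keys its strategy lookup on the family code, so after `yaml.safe_load` the `response_strategies` mapping is expected to look roughly like the parsed form below; the entry and wording are illustrative, not the actual contents of `config/response_strategies.yaml`:

```python
# Illustrative parsed shape of config/response_strategies.yaml — not the real entries.
response_strategies = {
    "FAM-ANG": {
        "rsm_code": "RSM-ANG",                        # placeholder code
        "strategy": "De-escalation and validation",   # placeholder wording
        "sample_response": "It sounds like something really frustrating happened...",
    },
    # ... one entry per FAM-* family code ...
}
```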
main.py
CHANGED
```python
# @@ -3,17 +3,21 @@ (imports and pipeline start)

from core.eris_reasoner import ERISReasoner
from core.hei_inference import HEIInference
from core.fec_controller import FECController
import yaml

def run_pipeline(user_input_text, force_hei=False):
    print("\n--- MEC MVP Test Run ---")
    print(f"[Main] Pipeline Input: {user_input_text}")

    # Load Response Strategies YAML
    with open('config/response_strategies.yaml', 'r', encoding='utf-8') as f:
        response_strategies = yaml.safe_load(f)['response_strategies']

    # 1️⃣ EIL Processor (handles both normalization and emotion processing)
    eil = EILProcessor()

    # Run emotion inference
    eil_packet = eil.infer_emotion(user_input_text)
    print(f"[Main] EIL Packet Output: {eil_packet}")

    # 2️⃣ ESIL Inference
    # ... (unchanged in this commit) ...

# @@ -40,6 +44,20 @@ (high-confidence path)

        fusion_prompt = fec.generate_prompt(final_uesp)
        print(f"[Main] Final Fusion Prompt:\n{fusion_prompt}")

        # 6️⃣ Simulated Empathic Response
        fam_code = final_uesp['primary_emotion_code']
        rsm_code = response_strategies.get(fam_code, {}).get('rsm_code', 'RSM-UNKNOWN')
        strategy_name = response_strategies.get(fam_code, {}).get('strategy', 'Strategy not defined')
        sample_response = response_strategies.get(fam_code, {}).get('sample_response', 'No response available')

        simulated_output = (
            f"Response Strategy Code: {rsm_code}\n"
            f"Response Strategy: {strategy_name}\n\n"
            f"{sample_response}"
        )

        print(f"\n[Main] Simulated Empathic Response:\n{simulated_output}")

    elif esil_packet['confidence_score'] < 0.65:
        print("\n[Main] Routing: escalate_to_hei")
        # 6️⃣ Trigger HEI Inference (Fallback for Low Confidence)
        # ... (unchanged in this commit) ...

# @@ -57,12 +75,26 @@ (low-confidence path and entry point)

        fusion_prompt = fec.generate_prompt(final_uesp)
        print(f"[Main] Final Fusion Prompt:\n{fusion_prompt}")

        # 8️⃣ Simulated Empathic Response
        fam_code = final_uesp['primary_emotion_code']
        rsm_code = response_strategies.get(fam_code, {}).get('rsm_code', 'RSM-UNKNOWN')
        strategy_name = response_strategies.get(fam_code, {}).get('strategy', 'Strategy not defined')
        sample_response = response_strategies.get(fam_code, {}).get('sample_response', 'No response available')

        simulated_output = (
            f"Response Strategy Code: {rsm_code}\n"
            f"Response Strategy: {strategy_name}\n\n"
            f"{sample_response}"
        )

        print(f"\n[Main] Simulated Empathic Response:\n{simulated_output}")

    else:
        print("\n[Main] Routing: LLM Assist — Not implemented")

if __name__ == "__main__":
    # Example input that matches a SAL Trigger:
    test_input = "I am feeling joy"

    # Run in FORCE HEI mode → set to True to test Symbolic Layer
    run_pipeline(test_input, force_hei=False)
```
mec_api.py
CHANGED
```python
from core.eil_processor import EILProcessor
from core.esil_inference import ESILInference
from core.eris_reasoner import ERISReasoner
from core.hei_inference import HEIInference
from core.fec_controller import FECController
from core.codex_informer import CodexInformer
from core.uesp_constructor import UESPConstructor  # still required for the UESP construction step below

def run_mec_pipeline(user_input_text, force_hei=False):
    # 1️⃣ EIL Processor (handles both preprocessing and emotion processing)
    eil = EILProcessor()
    pre_packet = eil.preprocess_text(user_input_text)
    eil_packet = eil.process_eil(pre_packet)

    # 2️⃣ ESIL Inference
    esil = ESILInference()
    esil_packet = esil.infer_esil(eil_packet)

    # 3️⃣ Forced HEI Mode: Ensure it forces the low confidence path if True
    if force_hei:
        esil_packet['confidence_score'] = 0.40  # force low confidence to trigger HEI

    # 4️⃣ Routing Logic based on Confidence Score
    if esil_packet['confidence_score'] >= 0.65:
        # 5️⃣ ERIS Reasoning (Final UESP Creation)
        eris = ERISReasoner()
        final_uesp = eris.reason_emotion_state(esil_packet)

    elif esil_packet['confidence_score'] < 0.65:
        # 6️⃣ Trigger HEI Inference (Fallback for Low Confidence)
        hei = HEIInference()
        pseudo_esil = hei.detect_low_signal(esil_packet)

        # 7️⃣ Post-HEI Path: Continue to ERIS → UESP → FEC
        eris = ERISReasoner()
        final_uesp = eris.reason_emotion_state(pseudo_esil)

    else:
        # fallback (LLM Assist not implemented)
        final_uesp = {"error": "LLM Assist not implemented."}

    # 8️⃣ UESP Construction
    # Assuming the UESPConstructor now works based on UESP data from ERIS or HEI
    uesp_constructor = UESPConstructor()
    final_uesp = uesp_constructor.construct_uesp(final_uesp)

    # 9️⃣ FEC Controller (Final Fusion Prompt)
    fec = FECController()
    fusion_prompt = fec.generate_prompt(final_uesp)

    # For Gradio frontend or API response:
    instruction_block = fusion_prompt  # You can later split instruction more if needed
    return fusion_prompt, instruction_block, final_uesp
```
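Finally, a minimal call sketch for the API entry point above; the actual outputs depend on the Codex configs and the loaded model:

```python
from mec_api import run_mec_pipeline

# Run the full MEC pipeline on a single utterance
fusion_prompt, instruction_block, final_uesp = run_mec_pipeline("I am feeling joy")
print(fusion_prompt)   # LLM instruction block built by FECController
print(final_uesp)      # UESP dict (contents depend on the Codex configs)
```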