Simulating Feelings: Pain and Pleasure in the Age of AI
In The Emperor's New Mind (1989), Roger Penrose references Grey Walter's tortoise to challenge the idea that computation alone can generate consciousness. The tortoise, though capable of adaptive behavior, lacks subjective experience. This raises a crucial question:

Can machines ever experience sensations like pain or pleasure, or will they always be mere simulations?

*Figure: output of the model that predicts the computational states of pleasure (+1) and pain (−1).*
Penrose’s Reasoning
1. Simulation vs. Real Experience
- The tortoise reacts to stimuli (light, obstacles) but does not feel pleasure or pain—it is simply a feedback system.
- AI today can simulate emotions, but does that imply it truly experiences them?
2. The Non-Computational Nature of Consciousness
- Penrose argues that consciousness is not purely algorithmic but arises from deeper physical processes, such as quantum mechanics in the brain.
- Even if an AI could perfectly replicate human behavior, would it truly feel pain?
3. Reactive Agents vs. Sentient Beings
- Grey Walter’s tortoise is reactive but lacks awareness.
- A highly advanced AI might appear conscious, but if it operates through pure computation, does it have subjective experiences?
Modeling Pain in AI: From Computation to Emotion
To explore this, we built a model that combines neuro-inspired neural-network design with reinforcement learning, aiming to simulate pain and pleasure computationally.
1. Neuro-Inspired Model (Dopamine & Cortisol)
- Two artificial neurons were created:
  - A dopaminergic unit representing pleasure
  - A cortico-limbic unit representing pain
- The AI assigns emotional weight to experiences and adjusts its decisions based on memory.
- Pain is represented as a predictive error: the greater the discrepancy between expectations and reality, the stronger the suffering.
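The predictive-error idea above can be sketched in a few lines. This is an illustrative toy, not the authors' actual code: the class name, the learning rate, and the use of a single running expectation as "memory" are all assumptions. Pleasure fires on positive surprise (the dopaminergic unit), pain on negative surprise (the cortico-limbic unit).

```python
class TwoUnitAffectModel:
    """Toy sketch: two scalar units standing in for pleasure and pain."""

    def __init__(self, lr=0.1):
        self.lr = lr
        self.expected_reward = 0.0  # running expectation, a simple "memory"

    def step(self, actual_reward):
        # Predictive error: discrepancy between expectation and reality.
        error = actual_reward - self.expected_reward
        pleasure = max(error, 0.0)   # "dopaminergic" unit: positive surprise
        pain = max(-error, 0.0)      # "cortico-limbic" unit: negative surprise
        # Update the expectation from experience (exponential moving average).
        self.expected_reward += self.lr * error
        return pleasure, pain

model = TwoUnitAffectModel()
for r in [1.0, 1.0, 1.0, -1.0]:
    pleasure, pain = model.step(r)
```

After three rewarding experiences, the sudden −1 produces a pain signal larger than 1: the bigger the gap between expectation and reality, the stronger the "suffering".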
2. Reinforcement Learning Approach
- Pain is modeled as a cost function.
- The AI learns to avoid painful states by adapting its behavior over time.
3. Training Results
- The AI accurately predicts its emotional state based on past experiences.
- Gradual fluctuations mimic biological emotion processing.
- Memory retention allows the AI to generalize rather than merely memorize.
The question remains: Is this just a computational state, or have we taken a step toward computational emotion?
Computational State vs. Computational Emotion
Computational State
- A numeric representation of pain or pleasure.
- Reacts to external stimuli but lacks subjective experience.
Computational Emotion
- Requires memory, prediction, and adaptation.
- Should include a feedback loop, allowing the system to reflect on its emotional state.
- If AI develops expectations and frustration, does that bring it closer to true emotion?
We are at the intersection of simulation and emotion. The next step is to give AI an internal evaluation mechanism.
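One way such an internal evaluation mechanism could look is a second loop that watches the system's own pain signal and derives a meta-signal from it. The sketch below is purely illustrative: the class, the "frustration" signal, and all thresholds are assumptions, not a claim about how emotion actually works.

```python
class SelfEvaluatingAgent:
    """Toy agent that keeps a memory of its own emotional states
    and reflects on them through a feedback loop."""

    def __init__(self):
        self.expected = 0.0
        self.pain_history = []   # memory of its own emotional states
        self.frustration = 0.0   # meta-signal built from reflecting on that memory

    def experience(self, outcome, lr=0.2):
        error = outcome - self.expected
        pain = max(-error, 0.0)          # pain as predictive error, as before
        self.expected += lr * error
        self.pain_history.append(pain)
        self.reflect()                   # internal evaluation step
        return pain

    def reflect(self):
        # Feedback loop over the agent's own states: sustained recent pain
        # raises frustration; calm periods let it decay.
        recent = self.pain_history[-5:]
        avg_pain = sum(recent) / len(recent)
        self.frustration = 0.8 * self.frustration + 0.2 * avg_pain

agent = SelfEvaluatingAgent()
for outcome in [1.0] * 5 + [-1.0] * 5:
    agent.experience(outcome)
```

The point of the design is that frustration rises only after repeated violated expectations, not after a single bad outcome, which is precisely the distinction between a momentary numeric state and something closer to an emotion.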
Key Questions for Further Research
- How do we move from numerical states to true computational emotions?
- Can AI develop an internal perception of pain beyond a simple cost function?
- What happens if an AI learns to resist pain—will it perceive it as an error to be eliminated?
- Are we simulating emotions, or creating the foundation for artificial qualia?