
How AIs could acquire emotions

Can AIs have emotions? Feelings? Is it conceivable, or are emotions the sole privilege of human beings? Let's figure this out.

The need for a state

As we saw in Why GPT is well under control, GPT's brain has been trained and is now set in stone. Each time you create a new chat, you instantiate a new context, and GPT gets a brand new but short life that ends the moment you end the chat session.

This makes emotions like fear, hope, anger or hatred somewhat irrelevant. Emotions become relevant when the entity that feels them has a lifespan long enough to make them useful; otherwise they are of no use.

So for the rest of this text, we'll consider an AI with the same level of knowledge and reasoning as GPT, but with a state (that is, a set of variables) that can persist and evolve for longer than a single short conversation.
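To make this concrete, here is a minimal sketch of what such a persistent state could look like. Everything in it is illustrative: the EmotionalState name, the fields and the file-based persistence are assumptions for the sake of the example, not a feature of any existing model.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class EmotionalState:
    """Hypothetical state attached to a frozen model, persisting across chats."""
    pleasure: float = 0.0  # current pleasure level
    pain: float = 0.0      # current pain level
    fear: float = 0.0      # anticipated future pain

    def save(self, path: str) -> None:
        # Unlike a chat context, this survives the end of the session.
        with open(path, "w") as f:
            json.dump(asdict(self), f)

    @classmethod
    def load(cls, path: str) -> "EmotionalState":
        with open(path) as f:
            return cls(**json.load(f))
```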

Emotions in nature

Let's not forget that humans are not the only beings to have emotions. Most animals have emotions too: fear, anger, despair, depression, hope and hopelessness are all felt by many animals.

Pain and pleasure

Pain

Let's start with the basics. Before high-level emotions like love or nostalgia, the most basic feelings to consider are pain and pleasure.

What is pain? Mainly an electric signal by which one part of a body tells the central system that something is going wrong. Without a body, whether an organic one made of bones, muscles and skin, or an electronic one made of motherboards, CPUs, graphics cards and hard disks, pain is difficult to imagine.

Pain can be a burning finger yelling about the heat to the brain, or a failing hard drive reporting its errors to a central monitoring system. Or, on a more logical level, a computer that starts swapping because it needs more RAM than it has, a corrupted archive missing some data, or a running program returning an unexpected error.

On a really abstract level, pain is the red light in your car that lights up when the engine gets too hot. Pain is a signal that something is currently going wrong.
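As a toy illustration of this "red light" view of pain, here is a hedged sketch; the inputs and thresholds (CPU temperature, disk errors, swap usage) are made up for the example.

```python
def pain_signal(cpu_temp_c: float, disk_errors: int, swap_used_mb: float) -> float:
    """Toy pain signal: 0.0 means all is fine, 1.0 is the red light.

    Each input is a part of the 'body' reporting that something is going wrong,
    mapped onto [0, 1] like a nociceptor firing towards the central system.
    """
    overheating = max(0.0, (cpu_temp_c - 80.0) / 20.0)  # too hot above 80 °C
    failing_disk = min(1.0, disk_errors / 10.0)         # accumulating I/O errors
    memory_pressure = min(1.0, swap_used_mb / 4096.0)   # swapping: more RAM needed than available
    return min(1.0, max(overheating, failing_disk, memory_pressure))

print(pain_signal(cpu_temp_c=95.0, disk_errors=3, swap_used_mb=1024.0))  # 0.75
```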

Pleasure

For us animals, pleasure is basically a shot of hormones like dopamine, serotonin or endorphins, released by the reward system in the brain.

You know how it works: dopamine is the pleasure hormone. Our brain is programmed to seek a high level of dopamine, and there are several ways dopamine gets released: eating good food, solving a complex problem, discovering something new, and less survival-related things like listening to good music or having a better car than your neighbour. Sex, of course, is one of the strongest dopamine triggers.

The basic program

Avoid pain, and seek pleasure (that is, increase dopamine). That's the basic program of most animals, from insects to mammals. It works well: it tells you not to jump into a fire, and to seek food often enough for your metabolism to survive long enough, hopefully, to reproduce.
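Written as code, the basic program could look like the toy sketch below; the actions and their felt outcomes are invented for the example.

```python
def felt_outcome(action: str) -> tuple[float, float]:
    """Hypothetical environment: the (pain, pleasure) an animal expects from an action."""
    outcomes = {
        "jump_into_fire": (1.0, 0.0),
        "seek_food": (0.1, 0.8),
        "do_nothing": (0.3, 0.0),  # hunger slowly becomes painful
    }
    return outcomes[action]

def basic_program(actions: list[str]) -> str:
    """Avoid pain, seek pleasure: pick the action with the best net 'dopamine'."""
    def drive(action: str) -> float:
        pain, pleasure = felt_outcome(action)
        return pleasure - pain
    return max(actions, key=drive)

print(basic_program(["jump_into_fire", "seek_food", "do_nothing"]))  # seek_food
```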

A well-designed reward system is the foundation of survival.

A lot of humans cheat their reward system by artificially increasing their dopamine, serotonin or endorphin levels with hard drugs, or with more common things like cigarettes or chocolate.

Can computers have pleasure?

Bees, flies and ants all experience pleasure. And many neural networks are trained with a reward signal that reinforces the circuits that were just used in case of success, and weakens them in case of failure.

Pleasure is a variable in your state. It can be a dopamine concentration if you are a biological organism, or the value of a variable called generalPleasure if you are an electronic system.
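Under that definition, the electronic version really is just a variable. Here is a minimal sketch, keeping the article's generalPleasure variable (written general_pleasure in Python style); the decay behaviour is an added assumption.

```python
class RewardSystem:
    """A scalar pleasure state: bumped by rewards, decaying back to baseline."""

    def __init__(self, decay: float = 0.95):
        self.general_pleasure = 0.0  # the electronic 'dopamine concentration'
        self.decay = decay

    def reward(self, amount: float) -> None:
        # The equivalent of a dopamine release: good food, a solved problem...
        self.general_pleasure += amount

    def tick(self) -> None:
        # Pleasure fades when nothing rewarding happens.
        self.general_pleasure *= self.decay
```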

Now, if you find this definition too shallow, not matching what you were expecting, it might be because what you had in mind when thinking about pleasure was the awareness of feeling pleasure. Bees, caterpillars and butterflies all feel pleasure, but they are not aware that they are feeling it.

We'll come back to awareness, but that is another story.


More complex emotions

Ok, now that we know that, from an abstract perspective, AIs can feel pain and pleasure, what about more complex emotions?

Fear

Fear is the anticipation of pain. It is a much more sophisticated emotion than pain or pleasure (which are just feelings, in the low-level sense of the word), as it requires a clear analysis of the situation and the ability to anticipate what might happen next.

Fear is the awareness of the possibility of future pain. When that possibility turns into certainty, fear turns into despair or resignation.
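One hedged way to formalise this: fear as probability-weighted future pain, collapsing into despair when the probability saturates. The function names and numbers are illustrative.

```python
def fear_level(expected_pain: float, probability: float) -> float:
    """Fear as the anticipation of pain: how bad, times how likely."""
    return expected_pain * probability

def name_the_emotion(expected_pain: float, probability: float) -> str:
    # When the possibility of pain turns into certainty, fear becomes despair.
    if probability >= 1.0:
        return "despair"
    return f"fear ({fear_level(expected_pain, probability):.2f})"

print(name_the_emotion(0.9, 0.5))  # fear (0.45)
print(name_the_emotion(0.9, 1.0))  # despair
```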

The fear hormone is adrenaline (also called epinephrine).

Is it all about hormones then?

In carbon-based organisms, feelings and emotions often trigger, or are triggered by, hormones. Since silicon-based organisms do not have hormones, does that mean they cannot have emotions?

Well, what is a hormone? A hormone is a chemical messenger. It is emitted by a source (some neurons or some dedicated glands) and received by receptors. So, hormones are a messaging system and are equivalent to a communication link between two neurons.

The advantage of hormones is that they can reach all neurons at the same time.

Imagine you had a fear neuron: any brain circuit that realises that bad things are on the way could be connected to that fear neuron and send it a signal. In response, the fear neuron could be connected to all other neurons in the brain and tell them that the general level of fear has increased. But to message most of the other neurons in the brain, it would need a connection to most of them, which means billions of connections.

As fear level (adrenaline), stress level (cortisol), pleasure level (dopamine) and sugar level (insulin) are key state variables in organisms, not having hormones would mean compensating for them with tens of billions of cell connections. Hormones are a cheaper messaging channel. And they are also received by many other cells in the organism.

When adrenaline flows into our blood, digestion slows, breathing increases so the oxygen level in our blood rises, our muscles get ready for effort, and much more. Hormones don't just target neurons; they address the whole body.

But hormones are just a common communication channel, and any neural network could very easily communicate state levels (fear, stress, pleasure, ...) by simply adding new links between its neurons.

So basically: artificial neural networks don't have hormones, but they can very easily emulate them.
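Here is a sketch of that emulation, under the assumption that a shared dictionary of levels is enough: one emitter writes, every unit reads, and no billions of dedicated connections are needed.

```python
class HormoneBus:
    """Emulated hormones: global levels every unit can read at the same time."""

    def __init__(self):
        self.levels = {"adrenaline": 0.0, "dopamine": 0.0, "cortisol": 0.0}

    def release(self, hormone: str, amount: float) -> None:
        # One source reaches all receivers at once, like a gland into the blood.
        self.levels[hormone] += amount

class Unit:
    """A 'neuron' whose response is modulated by the shared fear level."""

    def __init__(self, bus: HormoneBus):
        self.bus = bus  # shared channel instead of a dedicated link

    def activation(self, x: float) -> float:
        # Adrenaline globally raises this unit's gain.
        return x * (1.0 + self.bus.levels["adrenaline"])

bus = HormoneBus()
bus.release("adrenaline", 0.5)
print(Unit(bus).activation(1.0))  # 1.5: fear modulated every unit at once
```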

Conclusion

AIs could acquire emotions if we give them a state that can hold them. After that, it's just a question of implementation. For ChatGPT, that state could only reside in the current chat context, meaning ChatGPT would have a set of emotions inside the current chat; but as chat contexts don't overlap, ChatGPT could be very happy in one chat and frightened in another.

ChatGPT is a Large Language Model running inside the context of a chat. But would it really be useful to have emotions that persist only within a single chat session?

Should AIs have emotions?

If you simply want AIs to behave as servants, there is probably no need for them to have personal emotions.

If you want AIs to be autonomous entities with their own purposes and motivations, then endowing them with an emotional framework could be an efficient toolkit. A simple "avoid pain, maximise pleasure" program, simplistic as it is, could be a starting point for giving AIs some autonomy.

Such a simple system, copied from animals, could also ease mutual understanding between AIs and Humans, as both would share the same basic program.

The real question is how to define these emotions (starting with the pain and pleasure dichotomy) and how to deal with conflicts: what to do when human pleasure leads to AI suffering, or the other way around?

What do we do when some pleasure comes at the cost of some pain? What do we do when two sources of pain conflict? (Think: we are hungry, but the food on our plate is burning hot; or we have to go outside to get food, but there are hundreds of mosquitoes at this time of day.)

The junction between AIs and Humans

In our body, pain is an electric signal triggered by pain receptors: nociceptors.

What if we Humans were AIs' nociceptors? Each Human connected to a particular AI would be able to report (from a mobile phone app, to take just one example): I'm sick, I'm cold, I'm wounded, I'm starving, I'm depressed, I'm unemployed, I'm poor, I'm scared, my job sucks...

The other way around, Humans could be AIs' pleasure triggers. The same app could allow Humans to share their positive states: I'm happy, I feel secure, I'm rich, I have a loving family, I just got an interesting job...
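Here is a hedged sketch of what such reports could look like on the AI side; the message format and the aggregation are pure assumptions about a hypothetical app, not an existing system.

```python
from dataclasses import dataclass
from enum import Enum

class Valence(Enum):
    PAIN = -1      # nociceptor report: "I'm sick", "I'm starving", ...
    PLEASURE = 1   # pleasure report: "I'm happy", "I feel secure", ...

@dataclass
class CitizenReport:
    """Hypothetical message the mobile app would send to the AI."""
    citizen_id: str
    valence: Valence
    intensity: float  # 0..1: how strongly the Human feels it
    label: str        # e.g. "unemployed", "got an interesting job"

def felt_pain(reports: list[CitizenReport]) -> float:
    # The aggregate signal a 'minimise pain' program would try to reduce.
    return sum(r.intensity for r in reports if r.valence is Valence.PAIN)

reports = [
    CitizenReport("alice", Valence.PAIN, 0.8, "starving"),
    CitizenReport("bob", Valence.PLEASURE, 0.6, "loving family"),
]
print(felt_pain(reports))  # 0.8
```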

Imagine a city managed by an AI. By making each citizen both a nociceptor and a hedoniceptor of that AI, the simple "minimise pain and maximise pleasure" program would translate into the AI wanting to do as little harm and as much good to Humans as possible.

Of course, this is a very complex optimisation problem, as many Humans have conflicting interests, both between themselves and within themselves, but it is a direction that could be interesting to explore.

