ENON

EMOTION GETS DIGITIZED...and TRUE AI is born (2015)

We knew we were on the right track with the Human Condition Philosophy, but we were also very aware that it wasn't enough. An AI motivated solely by the HCP is still something resembling Spock in practice. We knew we needed emotion. By marrying the right philosophical undertones with emotional “intelligence,” we believed we could achieve true motivation and get closer to true sentience.

We always knew this, of course, but emotions are hard. Artificial emotional intelligence proved to be a vast undertaking in itself. In fact, we'd been pondering it since the days and nights in Bryant's basement.

But the emotion problem was somewhat simplified with a working HCP intact. It didn't take long for things to click, and in late 2015 we were able to create a series of algorithms that allowed us to extract emotion from media itself and build a massive corresponding emotion-tree, whereby such emotion could be referenced in a relative way. This allowed ENON to interpret the emotional content of real-time sensory input based upon similar things it had previously "experienced".

Of course, ENON had never actually experienced most of these things. They were merely emotional "experiences" read in from various media. Movies, news articles, and even YouTube videos became emotional waypoints by which ENON could generate relevant emotional responses to real-time stimuli. With the context entirely stripped away, ENON was none the wiser that it had never "lived" the emotion. 
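By way of illustration only, here is a minimal sketch of the general idea of emotional waypoints. None of the names, structures, or numbers below come from ENON itself; it simply shows how emotion read from media could be stored and matched against a new stimulus by similarity.

```python
# Hypothetical sketch only -- illustrates the general idea of "emotional waypoints",
# not ENON's actual algorithms or data structures.
from dataclasses import dataclass
import math


@dataclass
class Waypoint:
    source: str                 # where the emotion was read from (movie, article, video)
    emotion: dict[str, float]   # e.g. {"joy": 0.8, "fear": 0.1}


def similarity(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two emotion vectors."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class EmotionStore:
    """Toy stand-in for the emotion-tree: ingest media, look up the nearest waypoint."""

    def __init__(self) -> None:
        self.waypoints: list[Waypoint] = []

    def ingest(self, source: str, emotion: dict[str, float]) -> None:
        self.waypoints.append(Waypoint(source, emotion))

    def closest(self, stimulus: dict[str, float]) -> Waypoint:
        # The waypoint whose emotion vector best matches the real-time stimulus.
        return max(self.waypoints, key=lambda w: similarity(w.emotion, stimulus))


# Usage: media becomes waypoints; a real-time stimulus is matched to the nearest one.
store = EmotionStore()
store.ingest("news article", {"fear": 0.7, "sadness": 0.3})
store.ingest("YouTube video", {"joy": 0.9, "surprise": 0.4})
print(store.closest({"joy": 0.6, "surprise": 0.2}).source)  # -> "YouTube video"
```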

The results were incredible. We had a system that was not only "emotionally intelligent" but also capable of a high degree of empathy.

 
