Sketches in the Ruins of My Mind

graphic: Sketches in the Ruins of My Mind.

The Mathematics of Tears

Dateline: 6 June 2007
Author: Johnny <>
pic: Robot in a dark alley.

Below is a blog comment posting.

The problem I have with the fantasies dreamed up by the proponents of an AI (Artificial Intelligence) - one day, "real soon now" - becoming "sentient" (I put that in quotes because the word has been bastardised by widespread ignorance) is the very fact that because they aren't alive they can't be sentient in the true and original meaning of the term.

As animals we are embodied in our environment thanks to literally billions of years of interaction with it. Yes, on one level the brain is a machine and the mind an emergent property of this impossibly complex machinery, but it's embodied in its environment and all its functioning is contingent on environmental stimuli. The brain's function collapses in the absence of environmental stimuli (sensory deprivation is used as a means of torture for that very reason) because it cannot function as a disembodied entity. And part of that environment is the very body the brain is in, that body with its autonomic, reflexive, hard-wired genetic responses to millennia-old cues: the fish, the lizard, the rat, the ape - all our past lives, encoded in our DNA during the headlong rush on the river out of Eden.

A computer or computer system, however exquisitely engineered, could only ever be an imitation of life. Whatever it would be, it would not be human; it would not be animal. It would not, could not be moral, because conscience is a function of our biology and irrelevant without it. (Don't believe me? See, If It Feels Good to Be Good, It Might Be Only Natural.)

J. Storrs Hall, Kurzweil, et al. just don't get it. They're dualists in an age where science has rendered dualism meaningless, an age where observer and observed cannot exist one without the other, where matter and energy are simply different ways of looking at the same thing. The AI believers embrace a religion of sciolism: they've reduced the spiritual to algorithms in a finite state machine. This is an idea so fantastically (and obviously) wrong that I'm at a loss to imagine how they can ever be disabused of their folly.

'Here are we, one magical movement from Kether to Malkuth.
There are you, you drive like a demon from Station to Station.'
 - Bowie, Station to Station

I thoroughly enjoyed The Age of Virtuous Machines by J. Storrs Hall. Well-written and interesting.

However, it has more to do with Science Fiction than Science. (With more than a little wishful thinking thrown in!)

He's got the thing arse-backwards (as we would so charmingly say here in the UK).

Morality and ethics aren't a result of intelligence, they're a result of biology. Our highfalutin moral codes and philosophies are a post hoc rationalisation of biological drives - drives that go far back into evolution and were not the result of competing individuals calculating the best strategies.

The most convincing account of altruism in biology, as I understand it, is as a side effect of adaptations to the parent-child bond, where it's advantageous to care for offspring - a mechanism that goes far back in animal evolution. There's no obvious reason why an AI (of whatever level) should require this mechanism in any manner whatsoever.

Our monkey-brain may be smart, but it's the lizard-brain that's still firmly in charge of our lives. It's emotions and urges that drive our lives, not rationality and calculation.

There are structures in the human brain (and the brains of other animals) that enable the creature to quite literally 'feel what the other guy is feeling' - emotions and all. This is a tremendous boost in a social species, for obvious reasons. For the reasons detailed in the article, altruism - towards consanguineous individuals - is clearly an advantageous strategy in evolutionary terms in many species.

It's interesting in this context that there's research demonstrating that domestic dogs are better at understanding human intentionality than chimps - because dogs have co-evolved with humans, and their (and our) brains have adapted accordingly.

It's also clear that there are people whose brains are faulty in this respect and lack these empathic powers. They are psychopaths as a result - and the intelligent ones are extremely dangerous, especially because they can simulate an intuition of morality and a drive to care for others, yet are not bound by it themselves. Perhaps 5% of a typical human population shows this kind of pathology…and the fact that the smart ones from amongst them are running the world explains a whole lot!

The corporation is a good example of a machine with no conscience, just a simulation of one. Again, that explains a great deal about our world. It's not an AI in the sense Kurzweilers dream about, of course, because the basic drives and motivations are plugged in by humans and can be arbitrarily changed. Once running, however, it does take on a psychopathic, calculating life of its own.

There's no reason to suppose that a machine intelligence would be anything other than a calculating psychopath that would, yes, use game theory and whatever mathematical or philosophical tools it had at its disposal to optimise outcomes. It couldn't truly be a moral being because it wouldn't be alive and feeling.
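To make concrete the kind of purely calculating optimisation I mean, here's a toy sketch of my own (nothing from Hall's article): a payoff-maximising agent facing a one-shot Prisoner's Dilemma. Defection dominates - it's the best reply to either opponent move - so a pure calculator never cooperates, and no conscience ever enters into it.

```python
# Toy one-shot Prisoner's Dilemma. Payoffs are my payoff for each
# (my_move, their_move) pair, with the usual ordering T > R > P > S.
PAYOFF = {
    ("defect",    "cooperate"): 5,  # temptation
    ("cooperate", "cooperate"): 3,  # reward for mutual cooperation
    ("defect",    "defect"):    1,  # punishment for mutual defection
    ("cooperate", "defect"):    0,  # sucker's payoff
}

def best_move(their_move):
    """Pick whichever move maximises my payoff against a known opponent move."""
    return max(("cooperate", "defect"), key=lambda m: PAYOFF[(m, their_move)])

# Defection is the best reply whatever the other player does:
print(best_move("cooperate"))  # defect
print(best_move("defect"))     # defect
```

The point of the sketch is only that optimising a payoff function needs no empathy at all - the 'moral' outcome (mutual cooperation) is precisely what the calculation rules out.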

This isn't some 'religious' philosophical objection, this is a matter of plain empirical fact. A computer is not born of a woman to live, love and ultimately die; it has no endocrine system. It could not feel sorrow, or loss, or hope, because those things would not exist for it - they are part of our basic biology, not necessarily required for intelligence. Perhaps they could be simulated, but they would be only simulations and not the real thing.

On the bright side, without emotions and biological drives it's hard to see why a super-intelligent machine would 'feel' compelled to do anything at all. Maybe it will be closest to that classic scene in 'Dark Star' where the AI bomb is engaged in philosophical debate to persuade it not to explode - but it understands it was built to blow stuff up real good, and that's the point of its life.

J. Storrs Hall is confusing cause and effect. I think he needs to revisit and ponder on the basic science and basic assumptions some more.

'I know you and Frank were planning to disconnect me and I'm afraid that's something I cannot allow to happen.'
 - HAL 9000, 2001: A Space Odyssey