Notes on Singularity
If you accept the Big Bang, then by default you also agree that it occurred about 13.75 billion years ago. It didn't take long (on a cosmic scale) until atoms began to form: nuclei of protons and neutrons attracted roaming electrons into orbit around them. Atoms were the essential catalyst because, as soon as they started coming together as molecules, it was the element carbon, with its unique capacity to form bonds in four directions, that led to life.
So about 10 billion years after the Big Bang, carbon-based molecules became three-dimensional, and complex aggregations of carbon compounds were able to produce *self-replicating mechanisms*. Life is inevitable at this point; there's no turning back.
The real question is what facilitated our becoming what we are today. DNA was the digital recorder of evolution's experiments, which enabled organisms to develop brains and nervous systems. The human brain's propensity for abstract thought, in tandem with our opposable thumbs, has created technology that has catapulted our existence far ahead of every other living thing. This is where we are now.
Technology, though, has ironically revealed a huge gap between its evolution and ours. There's no arguing that the rate of progress in technology leaves biological evolution in the dust. As a skeptical reader of Ray Kurzweil, I still agree with his prognosis that technology doubles its computational capacity (memory and speed) every year. The human brain, in contrast, is believed to add one cubic inch of grey matter every hundred thousand years. We all know brain size is not the final word on intelligence; if it were, Moby Dick (a sperm whale) would have written Herman Melville's classic himself. Try to bear with my cheesy allusions: I just can't help myself.
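The size of that gap is easy to underestimate, so here is a quick back-of-the-envelope sketch. The starting unit and the 30-year window are arbitrary illustrative assumptions; only the two growth rates (doubling yearly vs. one cubic inch per hundred thousand years) come from the paragraph above.

```python
# Back-of-the-envelope: exponential doubling vs. a slow linear gain.
# The 30-year window and starting unit are illustrative assumptions.

capacity = 1.0  # arbitrary starting unit of computational capacity
for year in range(30):
    capacity *= 2  # Kurzweil-style doubling every year

print(f"After 30 years of doubling: {capacity:.0f}x")  # 2**30, over a billion

# The brain's ~1 cubic inch per 100,000 years works out, by comparison,
# to about 0.0003 cubic inch over the same 30 years.
brain_growth = 30 / 100_000
print(f"Brain growth over 30 years: {brain_growth:.4f} cubic inch")
```

The point isn't the specific numbers; it's that one curve is exponential and the other is effectively flat on any human timescale.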
Yes, this is where we are now, but where are we going? Notice how I italicized "self-replicating mechanisms" above; I think this is a very important idea for leading thinkers today. Stephen Wolfram is one of those thinkers, and he demonstrated that a simple process can create more complicated ones, namely with cellular automata (his Rule 110, a class 4 automaton). I know what you're thinking: "What the f***!?" It's just an algorithm that seemingly defies what an algorithm is able to do. An algorithm is a finite system, which implies it can be fully accounted for, i.e., predicted. Wolfram's Rule 110 yields complex patterns that are neither predictable nor random. His argument aims to show that all complexity in the world is derived from simple computations. Sounds nice, but let me explain why it doesn't hold up.
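To see how little machinery Rule 110 actually needs, here is a minimal sketch of it. Each cell looks only at itself and its two neighbors, and the eight possible neighborhoods map to new states via the bits of the number 110 (binary 01101110); the fixed-zero boundary and grid width are my choices for illustration.

```python
# Minimal sketch of Wolfram's Rule 110, a one-dimensional cellular automaton.
# The next state of each cell depends only on (left, self, right); the eight
# possible neighborhoods index into the binary digits of 110 (0b01101110).

RULE = 110

def step(cells):
    """Apply one Rule 110 update to a row of 0/1 cells (zeros beyond edges)."""
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        neighborhood = (left << 2) | (cells[i] << 1) | right  # value 0..7
        nxt[i] = (RULE >> neighborhood) & 1  # look up that bit of 110
    return nxt

# Start from a single live cell on the right and watch structure emerge.
row = [0] * 31 + [1]
for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Run it and a triangular lattice of interacting structures grows out of one cell, which is exactly the "complex but not random" behavior the paragraph above describes.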
Self-replication is important because of its association with DNA. When cells divide in our bodies, DNA is replicated. If you say evolution is the process of creating patterns of increasing order, then the cellular automaton is important, for it turns out unpredictable patterns, with one exception: the complexity of its patterns is limited. Although it takes a long time, the human body continuously evolves (a larger cerebrum, abstract thought, etc.), while the cellular automaton has a ceiling. You could run a billion iterations, and once the cellular automaton reached its maximum complexity, it would remain there.
It should be mentioned that John von Neumann was the predecessor to the cellular automaton with his universal constructor. He was probably the greatest mind of the 20th century outside of Einstein. Oppenheimer is generally given the credit for the minds behind creating the atomic bomb, but von Neumann was also an indispensable character in the same effort, known as the Manhattan Project. He made extensive trips to Los Alamos, New Mexico, where the covert project was underway. The Manhattan Project will be left for another day, though.