Tuesday, January 26, 2016

How far can artificial intelligence go?

I. J. Good and Vernor Vinge noted that if humans could produce smarter-than-human intelligence, then so could it, only faster. Good called this phenomenon an intelligence explosion. Vinge called it a singularity. Ray Kurzweil extends Moore's Law to project that global computing capacity will exceed the capacity of all human brains (at several petaflops and one petabyte per person) in the mid-2040s. He believes that a singularity will follow shortly afterward. This assumes that global computing capacity (operations per second, memory, and network bandwidth) continues to double every 1.5 years, as it has since the early 20th century.
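As a rough check on that projection, here is a minimal sketch of the extrapolation. The inputs are assumptions, not Kurzweil's exact figures: about 10^19 OPS of current global capacity (the estimate used in the next paragraph) and about 10^16 OPS ("several petaflops") per human brain.

```python
import math

# Assumed round numbers: ~1e19 OPS of global computing capacity today,
# ~1e16 OPS ("several petaflops") per human brain, ~1e10 people.
global_ops_now = 1e19
ops_per_brain = 1e16
people = 1e10
doubling_years = 1.5   # Moore's Law doubling period assumed in the post

target_ops = ops_per_brain * people                 # ~1e26 OPS for all brains
doublings = math.log2(target_ops / global_ops_now)  # ~23 doublings
year = 2016 + doublings * doubling_years
print(f"~{doublings:.0f} doublings needed, crossover around {year:.0f}")
```

With these round inputs the crossover lands near 2050; the exact year is sensitive to the per-brain figure assumed, which is why projections cluster around the mid-2040s.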

Current global computing capacity is about 10^19 operations per second (OPS) and 10^22 bits of memory, assuming several billion computers and phones. In 30 years, these figures should increase by 6 orders of magnitude. Ten billion human-brain-sized neural networks, each with 10^14 connections at a few bytes per connection and running at 100 Hz, would require roughly 10^26 OPS and 10^26 bits.
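The same estimate as a short sketch, using the round figures above (reading "a few bytes" per connection as 4 bytes, which is my assumption):

```python
# Resources to run ten billion human-brain-sized neural networks.
brains = 1e10
connections = 1e14        # connections (synapses) per brain
rate_hz = 100             # updates per connection per second
bits_per_connection = 32  # "a few bytes" taken as 4 bytes

ops_needed = brains * connections * rate_hz                # 1e26 OPS
bits_needed = brains * connections * bits_per_connection   # ~3e25, order 1e26 bits

growth_in_30_years = 2 ** (30 / 1.5)                       # 20 doublings, ~1e6
print(f"{ops_needed:.0e} OPS and {bits_needed:.0e} bits needed")
print(f"30 years of doubling every 1.5 years gives a factor of {growth_in_30_years:.0e}")
```

Starting from 10^19 OPS and 10^22 bits, six orders of magnitude of growth lands within about an order of magnitude of that requirement.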
It is hard to predict what will happen next because our brains are not powerful enough to comprehend a vastly superior intelligence. Various people have predicted a virtual paradise with magic genies, a robot apocalypse, an advanced civilization spreading across the galaxy, or a gray goo accident of self-replicating nanobots. Vinge called the singularity an event horizon on the future. We could no more comprehend a godlike intelligence than the bacteria in our gut can comprehend human civilization.
Nevertheless, physics (as currently understood) places limits on the computing capacity of the universe. Flipping a qubit in time t requires energy h/2t, where h is Planck's constant, 6.626 x 10^-34 joule-seconds. Seth Lloyd, in "Computational capacity of the universe", estimates that if all of the mass of the universe (about 10^53 kg) were converted to 10^70 J of energy, it would have been enough to perform about 10^120 qubit flip operations since the big bang 4 x 10^17 seconds (13.8 billion years) ago. This value roughly agrees with the Bekenstein bound of the Hubble radius, which sets an upper bound on the entropy of the observable universe of 2.95 x 10^122 bits.
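A minimal sketch of that arithmetic, using the h/2t energy-time relation as stated above and the round mass and age figures quoted (the constants are standard):

```python
# Order-of-magnitude check of Lloyd's bound on total operations since the big bang.
h = 6.626e-34          # Planck's constant, J*s
c = 3e8                # speed of light, m/s
mass_universe = 1e53   # kg
age_universe = 4e17    # s (~13.8 billion years)

energy = mass_universe * c ** 2        # ~1e70 J
# If flipping a qubit in time t costs h/(2t), then energy E sustained over
# time t allows about 2*E*t/h flips in total.
total_ops = 2 * energy * age_universe / h
print(f"Energy: {energy:.0e} J, total operations: {total_ops:.0e}")
```

This comes out around 10^121, within an order of magnitude of Lloyd's 10^120 figure.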
Writing a bit of memory, unlike flipping a qubit, is a statistically irreversible operation, which requires free energy kT ln 2, where T is the temperature and k is Boltzmann's constant, 1.38 x 10^-23 J/K. Taking T to be the cosmic microwave background temperature of about 3 K, the most we could store using 10^70 J is about 10^92 bits. This roughly agrees with Lloyd's estimate of 10^90 bits, which he calculated by estimating the number of possible quantum states of all 10^80 atoms in the universe.
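The corresponding storage limit, as a sketch using Landauer's kT ln 2 cost per bit and the same 10^70 J:

```python
import math

# Landauer limit: irreversibly writing one bit costs k*T*ln(2) of free energy.
k = 1.38e-23   # Boltzmann's constant, J/K
T = 3          # cosmic microwave background temperature, K (rounded up from 2.7)
energy = 1e70  # J, the mass of the universe converted to energy (from above)

max_bits = energy / (k * T * math.log(2))
print(f"Maximum irreversible bit writes: {max_bits:.0e}")   # ~3e92
```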
If we restrict our AI to the solar system and capture all of the sun's output of 3.8 x 10^26 W using a Dyson sphere with radius 10,000 AU and temperature 4 K, then we could perform 10^48 OPS (bit writes per second). To put this number in perspective, the evolution of human civilization from dirt 3.5 billion years ago required 10^48 DNA base copy operations and 10^50 RNA and amino acid transcription operations on 10^37 DNA bases over the last 10^17 seconds. Thus, our computer could simulate the evolution of humanity at the molecular level in a few minutes, a speedup of 10^15. Anything faster would require interstellar travel or speeding up the sun's energy output, perhaps by dropping a black hole into it. (A naive extrapolation of Moore's Law suggests we will reach this level of computing power around the year 2160, 75 years after we surpass the computing power of the biosphere.)
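A sketch of the Dyson sphere numbers (the sphere temperature follows from the Stefan-Boltzmann law; the 10^48 writes-per-second and 10^50 evolution-operation figures are taken directly from the paragraph above):

```python
import math

# Equilibrium temperature of a sphere of radius 10,000 AU radiating the
# sun's entire output, and the time to replay the 1e50 operations of evolution.
L_sun = 3.8e26    # W, solar luminosity
AU = 1.496e11     # m
R = 1e4 * AU      # sphere radius
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

T = (L_sun / (4 * math.pi * R ** 2 * sigma)) ** 0.25
print(f"Sphere temperature: {T:.1f} K")            # ~4 K

ops_per_second = 1e48     # bit writes per second quoted above
evolution_ops = 1e50      # RNA and amino acid transcription operations
seconds = evolution_ops / ops_per_second
speedup = 1e17 / seconds  # versus the ~1e17 seconds evolution actually took
print(f"Replay time: ~{seconds:.0f} s, speedup ~{speedup:.0e}")   # ~100 s, ~1e15
```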
Note: to estimate the computational power of evolution, I am assuming 5 x 10^30 bacteria with a few million DNA bases each, and a similar amount of DNA in other organisms. I am assuming a replication time of 10^6 seconds per cell, and that DNA replication makes up 1% of cell metabolism. See also "An Estimate of the Total DNA in the Biosphere".
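Those assumptions reproduce the figures used above; a sketch, taking "a few million" bases per bacterium as two million (my choice of round number):

```python
# Biosphere DNA content and copy operations, from the assumptions in the note.
bacteria = 5e30
bases_per_cell = 2e6         # "a few million" DNA bases per bacterium
biosphere_bases = 2 * bacteria * bases_per_cell   # doubled for non-bacterial DNA
replication_time = 1e6       # seconds per cell division
age_of_life = 1e17           # seconds, ~3.5 billion years

copy_ops = biosphere_bases * age_of_life / replication_time
transcription_ops = 100 * copy_ops   # replication assumed to be 1% of metabolism
print(f"DNA bases: {biosphere_bases:.0e}, copy ops: {copy_ops:.0e}, "
      f"transcription ops: {transcription_ops:.0e}")
```

This gives roughly 10^37 bases, 10^48 copy operations, and 10^50 transcription operations, matching the figures used in the Dyson sphere comparison.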
