Mind At Light Speed - A New Kind Of Intelligence
January 31st, 2006, by Emil Venere
Advanced optics such as lasers, crystals and holograms may work in concert with quantum theory to revolutionize computers in this century, promising tremendous speed and abilities that exceed the human brain, according to a new book.
Computers created within the next two decades could revolve around a technology in which laser beams converge inside crystals the size of sugar cubes, forming holographic images for processing huge amounts of information, says the author, Purdue University physics Professor David D. Nolte.
In his book, "Mind at Light Speed: A New Kind of Intelligence," Nolte describes how optics-based computer technologies may evolve over three generations during the next century.
The first generation, which is well under way, has seen the Internet transformed by fiberoptic cables, optical switches and other devices based on photonics, the use of light signals to transmit data.
The second generation, perhaps by the year 2020, will revolve around new types of optical processors. These "holographic computers" might use crystals that receive and manipulate data-laden images, processing information much faster than conventional computers.
Perhaps after 2050, a network connecting such computers might achieve intelligence, Nolte says.
"Imagine luminous machines of light made from threads of glass and brilliantly colored crystals that glow and shimmer, pulsating to the beat of intelligence," Nolte writes in the epilogue of this book.
Photonic applications in computing could extend far beyond the Internet, he says.
"One example is a machine that is able to sift through faces in a crowd to look for terrorists."
The third generation, possibly during the second half of the 21st century, could use "quantum optical" technologies to create computer networks capable of solving problems that currently are "uncomputable."
Photonics is the optical equivalent of electronics: Instead of using electrons to transmit and process information, photonics uses photons, or tiny units of light.
Conventional computers transmit and process pieces of information in serial form, or one piece at a time. However, future computers may use "parallel" streams of data, allowing entire images, billions of pieces of information, to be transmitted all at once instead of one piece at a time, Nolte says.
The result would be much faster networks and computers.
"Our lives are filled with images, so why not have machines that operate on images as their principle unit of information?" he says. Nolte has written about 130 peer-reviewed journal articles and specializes in "dynamic holography," in which crisscrossing beams of laser light are used to create and control images.
The evolution of optical computer applications has already begun.
"Optoelectronic" technologies now use lasers to transmit data on glass fibers.
However, conventional optoelectronics has a major drawback: electronic signals must first be converted into light pulses for transmission over the Internet, and at the opposite end of the line the light pulses must be converted back into electronic signals, a series of ones and zeros called binary digits, or bits, which the computer can understand.
"In optoelectronics, all of the control, all the intelligence, is in the electronics, in the transistors and the software driving it," Nolte says. "What we really want to do is get rid of some of those electronic circuits in the middle so that you can actually have light acting on light. If you could have light staying in the optical domain, its a lot faster, a lot more efficient, and it makes a lot more sense."
The human eye is a good model for an optical machine, he says.
"Light is an intrinsically parallel data structure," Nolte says. "Your eyes have a huge data-receiving capacity. Streaming into your eyes right now is about a gigabit, or a billion bits of information, per second."
But conventional technologies cannot easily supply the transmission capacity of the human eye. For images to be transmitted over the Internet, they must first be converted into a series of pixels, or picture elements. The pixels are then transmitted in serial form, one at a time, until the image has been received at the other end of the line.
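As an illustrative sketch (not from the book), the serial bottleneck described above can be shown in a few lines of Python: a two-dimensional image is flattened into a one-dimensional stream of bits that must travel down the line one at a time.

```python
# Hypothetical sketch of serial image transmission: the 2-D image
# (a grid of 8-bit pixel values) is flattened into a 1-D bit stream
# and sent one bit at a time, as on a conventional network.

def serialize_image(image):
    """Flatten a 2-D list of 8-bit pixel values into a serial bit stream."""
    bits = []
    for row in image:
        for pixel in row:
            # each 8-bit pixel becomes eight individual bits, sent in order
            bits.extend((pixel >> i) & 1 for i in range(7, -1, -1))
    return bits

# Even a tiny 2x2 "image" occupies 32 serial bit-times on the line.
image = [[255, 0], [128, 64]]
stream = serialize_image(image)
print(len(stream))  # 32
```

An optical processor operating on the whole image at once would sidestep exactly this bit-by-bit queue.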
The Advantage of Fiberoptics
Advances in fiberoptics have made it possible to transmit data in several wavelengths of light simultaneously along the same fiber, through a technology called wavelength division multiplexing. Currently, such systems can transmit 16 or 32 color channels at the same time. But the data are still sent serially as bits on each color channel, limiting the Internet's speed.
"The eyes need or want to see a gigabit per second. That means your eyes want to have a gigabit coming out of every computer screen, and then you are talking huge data rates on network trunk lines. You are talking petabits of data rate, or a one with 15 zeros after it, on the trunk lines."
That is more than a thousand times faster than current Internet technologies can handle on a single fiber.
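The arithmetic behind those figures can be sketched in Python. The 10 Gbit/s per-channel rate below is an assumption chosen for illustration, not a number from the article.

```python
# Back-of-the-envelope sketch of the capacities discussed above.
gbit = 10**9   # one gigabit
pbit = 10**15  # one petabit, "a one with 15 zeros after it"

# Aggregate capacity of a WDM fiber carrying serial bits on each color
# channel (10 Gbit/s per channel is an assumed, illustrative rate):
per_channel = 10 * gbit
for channels in (16, 32):
    print(channels, "channels ->", channels * per_channel // gbit, "Gbit/s")

# How many gigabit-per-second screens one petabit of trunk capacity feeds:
print(pbit // gbit, "simultaneous gigabit streams per petabit")
```

At the assumed rate, a 32-channel fiber carries 320 Gbit/s, so a petabit trunk line is indeed thousands of times beyond a single fiber today.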
Reaching such astonishing capacities will require not only fiberoptic lines that can transmit a thousand colors at the same time, but also a whole bevy of photonic devices, such as switches and processors in which light controls light inside transparent crystals.
"There are no such fully optical switches yet in the Internet, but they are being developed in laboratories," Nolte says.
The crystal switches will reduce the need to convert light pulses into electronic signals, but it is unlikely they will ever entirely replace electrons inside computers.
"The optical image processors probably will not replace more conventional processor chips altogether but will work alongside electronic microprocessors," he says. "People expect that optics is going to replace chips like Pentium processors. Thats wrong. Electronics will always be the simplest way of doing computation and control."
Transmitting Data with Photons
Photons, on the other hand, are far better than electrons at transmitting data over long distances, such as network lines. They also could be used inside of computers to process and transmit information. Specialized photonic components operating on images would be capable of processing a million bits of data in the same amount of time that electronic circuits can process 32 bits.
"I am talking about machines that have optical processors that keep the image always in the image domain so that it never gets chunked up into pixels," Nolte says. "Then the image becomes the unit of information instead of the measly bit."
Photons also would be ideal for possible future technologies in "quantum computing."
Today's computers work by representing information as a series of ones and zeros, in a code that is relayed by transistors, which are minute switches that can either be on or off, representing a one or a zero, respectively.
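A minimal illustration of that encoding: any number a computer handles is just a pattern of on/off transistor states.

```python
# A number as a series of ones and zeros, each bit one transistor
# that is either on (1) or off (0).
value = 42
bits = format(value, "08b")   # eight on/off states
print(bits)                   # '00101010'
print(int(bits, 2) == value)  # the bit pattern recovers the number: True
```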
Quantum computers would take advantage of a strange phenomenon described by quantum theory: Objects, such as atoms or electrons, can be in two places at the same time, or they can exist in two states at the same time. That means computers based on quantum physics would have quantum bits, or "qubits," that exist in both the on and off states simultaneously, making it possible for them to process information much faster than conventional computers.
A string of quantum bits would be able to calculate every possible on-off combination simultaneously, dramatically increasing the computers power and memory.
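The scale of that advantage can be sketched classically: a register of n qubits spans every on/off combination at once, and the number of combinations doubles with each added qubit. The sketch below merely enumerates those combinations; it is an illustration of the counting, not a quantum computer.

```python
# Why n qubits track 2**n possibilities at once: a quantum register's
# state spans every on/off combination of its bits simultaneously.
from itertools import product

def basis_states(n):
    """Enumerate every on/off combination an n-qubit register spans."""
    return ["".join(bits) for bits in product("01", repeat=n)]

print(len(basis_states(3)))   # 8 combinations held in superposition
print(len(basis_states(10)))  # 1024, doubling with every added qubit
```

Classically enumerating these states quickly becomes infeasible, which is exactly where the quantum computer's advantage lies.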
"You can do a lot of great quantum information processing with photons because they are so easy to work with," Nolte says. "What you gain with quantum computing is the ability to solve problems that are unsolvable otherwise. You are talking about processing capabilities that exceed human comprehension."
A future Internet of quantum computers might be able to "self-organize" into an intelligent network of nodes.
"This would be something that really looked a lot like a quantum brain," Nolte says.