2015-05-13

AI circuit classifies images, becomes more humanlike.

Santa Barbara, CA — In what marks a significant step forward for artificial intelligence, engineers at the University of California, Santa Barbara have demonstrated the functionality of a simple artificial neural circuit.


According to a new report in the journal Nature, the UC Santa Barbara team used the artificial circuit to successfully perform a relatively complex task we do every day: classify images. For the first time, a circuit of about 100 artificial synapses was shown to perform a simple version of this typical human task. “It’s a small, but important step,” said Dmitri Strukov, a professor of electrical and computer engineering. Until now, although everything from metal oxides and ferroelectrics to organic polymers and even slime molds had been tried, no substrate had proved suitable for making memristor networks with the required precision.


With time and further progress, the circuitry may eventually be expanded and scaled to approach something like the human brain’s, which has 10^15 (one quadrillion) synaptic connections. For all its errors and potential for faultiness, the human brain remains a model of computational power and efficiency for engineers like Strukov and his colleagues Mirko Prezioso, Farnood Merrikh-Bayat, Brian Hoskins and Gina Adam. Notably, the researchers found a way to microfabricate complete neural networks without using CMOS technology at all, or, for that matter, any transistors. The secret is to build thin stacks of the same aluminum and titanium oxide memristor material as before, but deposited with a low-temperature sputtering process that enables monolithic three-dimensional integration.


That’s because the brain can accomplish, in a fraction of a second, certain functions that computers would require far more time and energy to perform. As you read, your brain sorts through the visual cues on the screen and processes them based on the surrounding context of the entire article and the redOrbit website.

Key to this technology is the memristor (a portmanteau of “memory” and “resistor”), an electronic component whose electrical resistance depends on the history of the current that has flowed through the device: how much current, and in which direction. Unlike conventional transistors, which rely on the drift and diffusion of electrons and holes through semiconducting material, memristors operate by ionic movement, comparable to the way human neural cells generate electrical signals. By training networks of such devices on a set of example patterns (such as letters) and tuning the weights of the ‘synaptic’ connections, additional patterns or letters can then be recognized.
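The history-dependent resistance described above can be sketched in a few lines of code. This is the textbook linear ion-drift model (after the well-known HP Labs formulation), with illustrative parameter values; it is not a model of the aluminum/titanium oxide devices in the paper.

```python
# Minimal sketch of a memristor's defining property: resistance that
# depends on the history of the current that has flowed through it.
# Linear ion-drift model; all parameter values below are illustrative.

R_ON, R_OFF = 100.0, 16000.0   # resistance when fully doped / undoped (ohms)
MU = 1e-14                      # ion mobility, m^2 s^-1 V^-1 (illustrative)
D = 10e-9                       # active-layer thickness, m (illustrative)

def resistance(w):
    """Total resistance given the normalized doped-region width w in [0, 1]."""
    return R_ON * w + R_OFF * (1.0 - w)

def step(w, v, dt):
    """Advance the internal state w under an applied voltage v for time dt."""
    i = v / resistance(w)                 # Ohm's law
    w += (MU * R_ON / D**2) * i * dt      # ion drift moves the doped front
    return min(max(w, 0.0), 1.0)          # state stays within the device

w = 0.1
r0 = resistance(w)
for _ in range(2000):          # positive current drives resistance down...
    w = step(w, 1.0, 1e-4)
r1 = resistance(w)
for _ in range(2000):          # ...reversed current drives it back up
    w = step(w, -1.0, 1e-4)
r2 = resistance(w)
```

Running the same voltage forward and then in reverse leaves the device on a different branch of its resistance curve, which is exactly the "memory" the name refers to.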

The ionic memory mechanism brings several advantages over purely electron-based memories, making it very attractive for implementing artificial neural networks, Strukov added. “For example, many different configurations of ionic profiles result in a continuum of memory states, and hence analog memory functionality,” he said. “Ions are also much heavier than electrons and do not tunnel easily, which permits aggressive scaling of memristors without sacrificing analog properties.” This is where analog memory trumps digital memory. Typically, training involves updating the network’s connections through successive iterations of a synaptic weight-update rule, until the network finally settles into a stable state. As a test, the team’s network successfully classified 3 × 3-pixel black-and-white images into three classes, with the result indicated by the states of the three output neurons.
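The iterative weight-update scheme described above can be sketched in software with the delta rule on a single-layer network with three linear output neurons. The 3 × 3 patterns below are hypothetical stand-ins; the paper's actual letter images and its analog, device-level weight updates are not reproduced here.

```python
# Delta-rule training of a tiny single-layer classifier: 9 pixel inputs
# plus a bias, three output neurons, one per class. Patterns are made up.

patterns = [
    [1, 0, 1,  0, 1, 0,  1, 0, 1],   # class 0: an "X"-like image
    [1, 1, 1,  1, 0, 0,  1, 1, 1],   # class 1: a "C"-like image
    [1, 1, 1,  0, 0, 1,  1, 1, 1],   # class 2: a mirrored "C"
]
targets = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # one-hot: one neuron per class

X = [p + [1] for p in patterns]       # append a constant bias input
W = [[0.0] * 10 for _ in range(3)]    # 9 pixel weights + 1 bias per output
LR = 0.05                             # learning rate

for _ in range(500):                  # successive delta-rule iterations
    for x, t in zip(X, targets):
        out = [sum(wi * xi for wi, xi in zip(w, x)) for w in W]
        for k in range(3):            # nudge each synaptic weight
            err = t[k] - out[k]
            W[k] = [wi + LR * err * xi for wi, xi in zip(W[k], x)]

def classify(img):
    """Winner-take-all readout over the three output neurons."""
    x = img + [1]
    scores = [sum(wi * xi for wi, xi in zip(w, x)) for w in W]
    return scores.index(max(scores))
```

After training, each pattern activates its own output neuron most strongly, which is the software analogue of reading out the state of the three output neurons in the hardware demonstration.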

Potential applications already exist for this emerging technology, such as medical imaging, improved navigation systems, and even searches based on images rather than text. While these results are rather modest, the researchers anticipate that networks with a density of 100 billion synapses per square centimeter in each layer should soon be possible by shrinking the memristors down to 30 nm across.
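The projected density is straightforward to sanity-check: a crossbar with a 30 nm device pitch packs (1 cm / 30 nm)² junctions into a square centimeter of a single layer.

```python
# Back-of-the-envelope check of the article's density projection.
pitch_nm = 30.0                         # projected memristor pitch
nm_per_cm = 1e7                         # 1 cm = 10^7 nm
devices_per_side = nm_per_cm / pitch_nm
density = devices_per_side ** 2         # synapses per cm^2 per layer
print(f"{density:.2e} synapses/cm^2")   # about 1.1e11, i.e. ~100 billion
```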

The energy-efficient, compact circuitry the researchers are striving to create would also go a long way toward delivering the high-performance computers and memory storage devices users will continue to seek long after the transistor scaling predicted by Moore’s Law becomes too unwieldy for conventional electronics. “The exciting thing is that, unlike more exotic solutions, it is not difficult to imagine this technology integrated into common processing units and giving a serious boost to future computers,” said Prezioso. The very next step would be to integrate a memristor neural network with conventional semiconductor technology, which will enable more complex demonstrations and allow this early artificial brain to do more complicated and nuanced things. Ideally, according to materials scientist Hoskins, this brain would consist of trillions of these types of devices vertically integrated on top of one another. A perspective article accompanying the main paper claims that large memristor networks will do no less than ‘affect the future of computing,’ and others expect that massive memristor networks will enhance everything from laptops and phones to mobile robots.

Not everyone is convinced, however. Simple memristors won’t ever mimic the energy-frugal features of brains (and therefore make for energy-frugal machines), because the power of neurons is not their spikes, and the power of synapses is not their weights.

The power is everything else neurons do: all the intangibles that shape-shifting cells constantly pull off to rebuild their networks anew each day, minute, and second as they self-evolve through time to become ever smarter. As we argued recently in our post on subcellular information processing in cytoskeletal networks, each neuron is itself a universal machine (or at least the once-feral descendant of a protist that was), fully capable of what we might call the 4F behavioral suite: feeding, fighting, fleeing, and fornicating.

The idea that the brain is spending all this energy (20 watts or so) just to compute by integrating electrical pulses and sending along the result to its neighbors is no longer tenable. Furthermore, while there have been some amateurish attempts to convert spikes into bits and bits into spikes, futuristic ‘neuristor’ networks that can do something useful with spikes are nowhere on the roadmap of computing concepts. Meanwhile, the underlying devices continue to improve: one new kind of memristor, based on amorphous SrTiO3 doped with niobium at low (room) temperature, has just been shown to have superior switching and energy efficiency.
