
Low-energy artificial synapse created for neural network computing

Xinhua, February 22, 2017

Researchers at Stanford University and Sandia National Laboratories have created an artificial version of the space over which neurons communicate, called a synapse, that could help computers mimic one piece of the human brain's efficient design.

The new organic artificial synapse, intended to recreate the low-energy processing of the human brain, mimics the way synapses in the brain learn through the signals that cross them.

And unlike traditional computing, which processes information and then stores it in memory as separate steps, in this device the processing itself creates the memory.

When we learn, electrical signals are sent between neurons. The most energy is needed the first time a synapse is traversed. Every time afterward, the connection requires less energy.

This is how synapses efficiently facilitate both learning something new and remembering what we've learned. The artificial synapse, unlike most other versions of brain-like computing, fulfills these two tasks simultaneously, and does so with substantial energy savings.

"It works like a real synapse but it's an organic electronic device that can be engineered," said Alberto Salleo, associate professor of materials science and engineering at Stanford and senior author of a paper on the artificial synapse published this week in Nature Materials.

"It's an entirely new family of devices because this type of architecture has not been shown before. For many key metrics, it also performs better than anything that's been done before with inorganics."

Based on a battery design, the device consists of two thin, flexible films with three terminals, connected by an electrolyte of salty water.

It works as a transistor, with one of the terminals controlling the flow of electricity between the other two. The researchers program it by discharging and recharging it repeatedly, much as a neural pathway in the brain is reinforced through learning.

Through this training, they have been able to predict, to within 1 percent uncertainty, what voltage is required to bring the synapse to a specific electrical state; once there, it remains in that state.

In other words, unlike a common computer, where you save your work to the hard drive before you turn it off, the artificial synapse can recall its programming without any additional actions or parts.
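The programming behavior described above can be caricatured with a toy numerical model: a state variable that moves in small steps with each charge or discharge pulse and simply persists between reads. The linear step model, the number of states, and all class and variable names here are illustrative assumptions, not the device physics reported in the paper.

```python
# Toy sketch (illustrative only): a device whose "memory" is a conductance
# level set by programming pulses, loosely inspired by the three-terminal
# synapse described above. All numbers are assumed for illustration.

N_STATES = 500                 # hypothetical count of distinct conductance levels
STEP = 1.0 / N_STATES          # conductance change per pulse (assumed linear)

class ToySynapse:
    def __init__(self):
        self.state = 0.0       # normalized conductance; persists without power

    def pulse(self, polarity):
        """Apply one charging (+1) or discharging (-1) pulse, clamped to [0, 1]."""
        self.state = min(1.0, max(0.0, self.state + polarity * STEP))

    def read(self):
        """Reading the state does not disturb it (non-volatile behavior)."""
        return self.state

syn = ToySynapse()
for _ in range(100):           # "train" toward a target level with 100 pulses
    syn.pulse(+1)
print(round(syn.read(), 3))    # -> 0.2
```

The point of the sketch is that writing and remembering are the same operation: applying pulses both computes the new state and stores it, with no separate save step.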

Every part of the device is made of inexpensive organic materials largely composed of hydrogen and carbon. Cells have been grown on these materials, which have also been used to make artificial pumps for neurotransmitters.

The voltages applied to train the artificial synapse are also the same as those that move through human neurons.

All this means it is possible that the artificial synapse could communicate with live neurons, leading to improved brain-machine interfaces, the researchers said in a news release. In addition, the softness and flexibility of the device lend themselves to use in biological environments.

Before any applications to biology, however, the team plans to build an actual array of artificial synapses for further research and testing.

"Deep learning algorithms are very powerful but they rely on processors to calculate and simulate the electrical states and store them somewhere else, which is inefficient in terms of energy and time," said Yoeri van de Burgt, former postdoctoral scholar in the Salleo lab and lead author of the paper.

"Instead of simulating a neural network, our work is trying to make a neural network."