Brain-inspired computing mimics the structure and function of the brain to create energy-efficient computers that can think creatively, learn from experience and adapt to recognise things they’ve never seen before.
There is increasing demand for computers that match the human brain’s ability to adapt to novel situations and ‘learn’ from unstructured input data like images, audio, video and text files. Neuromorphic computing aims to achieve this by emulating the architecture of the human brain.
In a conventional digital computer, the central processing unit (CPU) directs the different parts of the computer to carry out program instructions sequentially: one task at a time, with each task finished before the next in the queue begins. Modern CPUs can run more than one thread of execution concurrently, multitasking among several threads or programs, but the underlying model remains step-by-step execution of instructions.
On top of that, the CPU does not itself hold the data or instructions for the tasks it processes. This information is stored in separate memory, so it must be constantly shuttled back and forth between the CPU and the memory components (the so-called von Neumann bottleneck).
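To make this back-and-forth concrete, here is a toy Python sketch of the fetch-execute cycle, a minimal illustration only: the instruction set, program and memory contents are invented. The ‘CPU’ holds almost no state of its own; every instruction and operand must be fetched from a separate memory and every result written back to it.

```python
# Toy von Neumann machine: the "CPU" holds no program or data itself.
# Each step it fetches an instruction from memory, fetches any operand
# from memory, and writes results back to memory.

memory = {
    "program": [("LOAD", "x"), ("ADD", "y"), ("STORE", "z")],  # instructions live in memory
    "x": 2, "y": 3, "z": None,                                 # data lives in memory too
}

def run(memory):
    accumulator = 0                                # the only state the "CPU" keeps
    for opcode, address in memory["program"]:      # fetch each instruction from memory
        if opcode == "LOAD":
            accumulator = memory[address]          # fetch an operand from memory
        elif opcode == "ADD":
            accumulator += memory[address]         # another round trip to memory
        elif opcode == "STORE":
            memory[address] = accumulator          # write the result back to memory
    return memory

print(run(memory)["z"])  # -> 5, after repeated round trips between "CPU" and memory
```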
In contrast, our brain is a living computer with tremendous computational power and energy efficiency. First, it has massive, innate parallel processing ability. Different regions of the brain are interconnected and interact directly with each other, so our brains can perceive and process many different types of information all at once.
Second, neurons communicate with each other through cell-to-cell connections called synapses, and the formation of synapses is what creates memory and learning within the brain. Our brain contains over a quadrillion synaptic connections, and a single neuron may pass signals to as many as 10,000 other neurons. Memory is embedded in the processing system itself, and information is distributed throughout the neural network rather than being shuffled back and forth between distinct components.
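As a loose software analogy (not a description of the brain, nor of the hardware discussed below), a classic Hopfield-style associative memory illustrates what it means for memory to be embedded in the connections: the stored pattern lives in the weights, spread across the whole network, and is recalled by letting the units interact. The eight-element pattern below is invented purely for illustration.

```python
import numpy as np

# Hopfield-style associative memory: the stored pattern is held in the
# connection weights ("synapses"), distributed across the network, rather
# than in a separate memory bank.

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])  # pattern to memorise (+1/-1 values)
W = np.outer(pattern, pattern).astype(float)      # Hebbian weights store the memory
np.fill_diagonal(W, 0)                            # no self-connections

noisy = pattern.copy()
noisy[:2] *= -1                                   # corrupt the first two elements

state = noisy.copy()
for _ in range(5):                                # let the network settle
    state = np.sign(W @ state)                    # each unit listens to all the others

print("recovered stored pattern:", np.array_equal(state, pattern))  # -> True
```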
Using our amazing brains as inspiration, neuromorphic computing re-envisions the design of computer hardware to mimic the processes of biological computation. Emeritus Prof. Rob Elliman’s research group at The Australian National University’s Research School of Physics is developing solid-state synapses and neurons for this purpose.
In one aspect of this work, Dr Sanjoy Nandi and Mr Sujan Das are developing nanoscale oscillators as neurons. An oscillator is a device that cycles periodically between two states as energy is stored and released. Their artificial neurons have a simple metal-oxide-metal layered structure based on a specific phase of vanadium oxide (V3O5) whose electrical conductivity increases abruptly at a temperature of ~147 °C. Passing current through the nanoscale neuron warms the V3O5 through resistive heating, just like the wires in a toaster. The device’s resistance changes accordingly and, in an appropriate circuit, the device can oscillate between high-resistance and low-resistance states.
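A minimal lumped-circuit sketch of this kind of relaxation oscillation is given below; all component values are invented for illustration. A threshold-switching element (standing in for the V3O5 device) sits in parallel with a capacitor and is fed from a supply through a series resistor: while the element is insulating the capacitor charges, and once the voltage is high enough the element heats and switches to its low-resistance metallic state, discharging the capacitor until it cools and switches back, after which the cycle repeats.

```python
# Relaxation-oscillator sketch: a threshold-switching device (a stand-in for
# the V3O5 element) in parallel with a capacitor C, fed from a supply V_s
# through a series resistor R_s. All values are illustrative only.

V_s, R_s, C = 5.0, 100e3, 1e-9    # supply (V), series resistance (ohm), capacitance (F)
R_off, R_on = 1e7, 1e3            # insulating / metallic device resistance (ohm)
V_th, V_hold = 3.0, 1.0           # switch-on and switch-off voltages (V)

dt, steps = 1e-7, 20000           # time step (s) and number of steps
v, metallic = 0.0, False
trace = []

for _ in range(steps):
    R_dev = R_on if metallic else R_off   # device state sets its resistance
    i_in = (V_s - v) / R_s                # current delivered through R_s
    i_dev = v / R_dev                     # current drained through the device
    v += dt * (i_in - i_dev) / C          # capacitor voltage update (Euler step)
    if not metallic and v >= V_th:        # Joule heating drives it metallic
        metallic = True
    elif metallic and v <= V_hold:        # it cools and reverts to insulating
        metallic = False
    trace.append(v)

# After the start-up transient the voltage cycles between roughly V_hold and V_th.
print(f"oscillates between ~{min(trace[5000:]):.2f} V and ~{max(trace[5000:]):.2f} V")
```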
To fabricate a neural network, these nanoscale oscillators are coupled together to form a massively parallel array, with information encoded in the relative phases of the oscillators and learning implemented by tuning the strength of the coupling elements. Such architectures replicate the parallel processing and memory embedding found in biological neural systems.
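One common way to reason about such arrays is a Kuramoto-style phase model, in which each oscillator is reduced to a single phase and the tunable coupling elements appear as a weight matrix. The sketch below uses an invented binary pattern and Hebbian-style couplings (not the group’s actual design): starting from random phases, the oscillators lock either in phase or in anti-phase, and the stored pattern can be read back from those relative phases.

```python
import numpy as np

# Kuramoto-style sketch of an oscillatory network: each oscillator is reduced
# to a phase, information is read from relative phases (in-phase = +1,
# anti-phase = -1), and "learning" amounts to choosing the coupling weights.

rng = np.random.default_rng(0)
stored = np.array([1, -1, 1, 1, -1, 1, -1, -1])   # target binary pattern (invented)
K = np.outer(stored, stored).astype(float)         # Hebbian-style coupling weights
np.fill_diagonal(K, 0)

n = len(stored)
theta = rng.uniform(0, 2 * np.pi, n)               # random initial phases
dt, coupling = 0.05, 1.0 / n

for _ in range(2000):                               # simple phase dynamics
    # each oscillator is pulled into phase (K > 0) or anti-phase (K < 0) with its neighbours
    dtheta = coupling * np.sum(K * np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta += dt * dtheta

readout = np.sign(np.cos(theta - theta[0]))         # phases measured relative to oscillator 0
print("pattern recovered:", np.array_equal(readout, stored * stored[0]))  # -> True
```

In hardware, tuning the coupling matrix K corresponds to adjusting the strength of the physical coupling elements between oscillators.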
It is expected that oscillatory neural networks (ONNs) will provide an energy-efficient means of undertaking simple repetitive functions, such as recognising or matching patterns, and will facilitate developments in areas such as medical image analysis, facial recognition and authentication, voice recognition and language translation, and cyber security.
The Elliman group’s development of devices and simple networks has relied heavily on access to ANFF-ACT fabrication facilities and expertise. If successful, their neuromorphic research will advance the understanding and application of brain-inspired computing and help Australia gain a leading edge in the rapidly emerging field of neuromorphic computing devices.