The burgeoning field of neuromorphic computing – arguably led by Intel’s “Loihi” chip – leverages virtualized neurons to mimic the behavior of the brain. Typically, the “brain” referred to by that sentence would be a human brain – but what if it weren’t?
“Traditionally we look at brain-inspired computing and we think of us – humans – and how complex our brain is and the tremendous capabilities that it has,” explained Angel Yanguas-Gil, principal materials scientist for Argonne National Laboratory, in a webinar yesterday. “What we have done at Argonne is to take a step back and look, not just at humans, but other types of inspirations that can help us make systems that are essentially capable of doing this type of smart learning.”
“And in particular,” he continued, “one of the most promising things that we have found is insects.”
A beeline to neuromorphics
The need for neuromorphic computing, Yanguas-Gil explained, stemmed from core limitations in the way current AI algorithms operate.
“Once you have a trained system, that system can be deployed – say on your smartphone, or in a chip,” he said. “But once that system is deployed, it’s static. If there are changes or perturbations in the environment that the system needs to respond to, it cannot do it unless it falls within the dataset that it has been trained for. That’s very different with respect to how we operate. We – and in general, humans and animals – we have a tremendous ability to learn on the fly, to react to new information and adapt to changes in the environment.
“What you would like,” he continued, “is to have a system that is able to recognize that something has changed and, with a few examples, be able to adapt and recover.”
Insects, Yanguas-Gil said, weren't just an inspiration for compact AI: insects like bees behaved like smart sensors that were capable of operating in a noisy environment, gathering information and – crucially – adapting to that information. "That," he said, "is the type of flexibility that we've been very interested in."
Some of this flexibility, he said, stemmed from that very compactness: some insects, for example, are able to make greater use of their scarce neurons by contextually adapting the functionality of their brain connections.
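The idea of contextual adaptation can be made concrete with a toy sketch (this is an illustrative example, not Argonne's actual model): a single small layer of neurons whose effective connectivity is gated by a context signal, so the same neurons do different jobs depending on the situation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (arbitrary, for illustration only)
n_in, n_out, n_ctx = 4, 3, 2

W = rng.normal(size=(n_out, n_in))          # shared synaptic weights
G = rng.uniform(size=(n_ctx, n_out, n_in))  # per-context gating factors in [0, 1]

def forward(x, context):
    """Gate the shared weights by context before the usual weighted sum."""
    effective_W = W * G[context]            # context reshapes the connectivity
    return np.tanh(effective_W @ x)

x = rng.normal(size=n_in)
y0 = forward(x, context=0)
y1 = forward(x, context=1)
# Same input, same neurons: the context signal alone changes the response.
```

The point of the sketch is economy: rather than dedicating separate neurons to each behavior, one compact set of connections is repurposed on the fly by a modulatory context signal.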
Argonne’s pincer maneuver
Yanguas-Gil explained that Argonne’s research into insect neuromorphic computing was a two-pronged approach. First, the mathematics of it all, beginning with an exploration of the state-of-the-art research in insect neuroscience and insect behavior and working toward extracting the mathematical principles that led to insects’ performance.
“Once we have those mathematical principles, we can run them in the same way that you would run machine learning or an AI algorithm,” Yanguas-Gil said. Then, they take those networks and compare them to benchmarks in machine learning – and, in particular, the subfield of continual learning. “It turns out that even though they are very small, and very nimble, they can perform as well in some of the tasks as state-of-the-art algorithms that are out there,” he said.
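Continual learning – the subfield Yanguas-Gil mentions – studies exactly the failure mode described earlier: a model trained on one task, then naively retrained on the next, forgets the first. A minimal sketch of what such a benchmark measures (hypothetical data and model, not the benchmarks Argonne used) looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_task(shift):
    """Two Gaussian classes; `shift` moves the whole task elsewhere in input space."""
    X0 = rng.normal(size=(100, 2)) + shift
    X1 = rng.normal(size=(100, 2)) + shift + 3.0
    X = np.vstack([X0, X1])
    y = np.array([0] * 100 + [1] * 100)
    return X, y

def fit_centroids(X, y):
    """Trivial 'model': the mean of each class."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def accuracy(centroids, X, y):
    """Classify each point by its nearest class centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return (dists.argmin(axis=1) == y).mean()

task_a = make_task(shift=0.0)
task_b = make_task(shift=10.0)

centroids_a = fit_centroids(*task_a)
acc_on_a_before = accuracy(centroids_a, *task_a)  # high: trained on task A

centroids_b = fit_centroids(*task_b)              # naive retraining on task B only
acc_on_a_after = accuracy(centroids_b, *task_a)   # collapses: task A is "forgotten"
```

A continual-learning benchmark scores how well an algorithm holds on to `acc_on_a_before` after learning task B – which is where, per Yanguas-Gil, the small insect-inspired networks held their own against much larger state-of-the-art algorithms.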
Second: with those algorithms in hand, Yanguas-Gil said, "you want to port them into hardware." He outlined the hardware approaches that Argonne was exploring to take advantage of insect neuromorphic computing, which included research on off-the-shelf devices like FPGAs with collaborators like the University of Texas at San Antonio and work with state-of-the-art neuromorphic chips like Intel's Loihi.
Designing a stronger shell
“The final thing, though, is that we can take these ideas and figure out … how we would change the way we design chips with novel materials,” he said. Yanguas-Gil outlined how the researchers were leveraging Argonne’s “extremely strong program” in the field of atomic layer deposition – a thin-film technique used in semiconductor production – to conduct advanced co-design research for neuromorphic computing.
“We use [the neuromorphic application] as a target to simultaneously design not just the architecture … but [to identify] what type of novel materials we need, or how to best integrate the materials that we already have, into this architecture to optimize the ability to learn in real-time.”
Some of this research, Yanguas-Gil said, was targeted at making those platforms more resilient to extreme environments.
"We have found that the combination of these novel materials with other non-silicon platforms – like silicon carbide – can help you maximize the amount of compute that you can carry out while minimizing the number of components that are needed," he said, "which is something really important when you move to temperatures as high as 300 to 400 degrees Celsius – and even high-radiation environments." One material that Argonne developed years ago, he said, allowed for tuning of resistivity across "many orders of magnitude" and was resistant to temperatures up to 500 degrees Celsius.
Yanguas-Gil sees several applications for this kind of ultra-compact, adaptable, high-resilience neuromorphic hardware, mentioning self-driving cars ("If you want the vehicle to react to a change that it hasn't been trained for without failing catastrophically, that's one application") as well as the brain-computer interfaces used to control prosthetics.