Summary: Researchers have developed a revolutionary synaptic transistor inspired by the human brain. This device can simultaneously process and store information, mimicking the brain’s ability to think at a higher level.
Unlike previous brain-like computing devices, this transistor remains stable at room temperature, operates efficiently, consumes minimal power, and retains stored information even when turned off, making it suitable for real-world applications.
The study represents a major breakthrough in creating AI systems with greater energy efficiency and advanced cognitive functions.
- The synaptic transistor combines two atomically thin materials, bilayer graphene and hexagonal boron nitride, in a moiré pattern to achieve neuromorphic functionality.
- It recognizes patterns and demonstrates associative learning, a form of higher-level cognition, even with imperfect input.
- This technology represents a significant departure from traditional transistor-based computing, aiming to improve energy efficiency and processing capabilities for AI and machine learning tasks.
Source: Northwestern University
Taking inspiration from the human brain, researchers have developed a new synaptic transistor capable of higher-level thinking.
Designed by researchers at Northwestern University, Boston College and the Massachusetts Institute of Technology (MIT), the device simultaneously processes and stores information, just like the human brain. In new experiments, the researchers demonstrated that the transistor goes beyond simple machine-learning tasks of categorizing data and is capable of performing associative learning.
Although previous studies have exploited similar strategies to develop brain-like computing devices, those transistors could not operate outside of cryogenic temperatures. The new device, by contrast, is stable at room temperature. It also operates at high speeds, consumes very little power, and retains stored information even when the power is turned off, making it ideal for real-world applications.
The study will be published Wednesday, December 20 in the journal Nature.
“The brain has a fundamentally different architecture than a digital computer,” said Mark C. Hersam of Northwestern, who co-led the research.
“In a digital computer, data flows between a microprocessor and memory, which consumes a lot of power and creates a bottleneck when trying to multitask.
“On the other hand, in the brain, memory and information processing are co-located and fully integrated, resulting in much greater energy efficiency. Our synaptic transistor also performs simultaneous memory and information processing functionality to more closely mimic the brain.”
Hersam is the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering. He is also chairman of the Department of Materials Science and Engineering, director of the Materials Science and Engineering Research Center, and a member of the International Institute of Nanotechnology. Hersam co-led the research with Qiong Ma of Boston College and Pablo Jarillo-Herrero of MIT.
Recent advances in artificial intelligence (AI) have prompted researchers to develop computers that work more like the human brain. Conventional digital computing systems have separate processing and storage units, causing data-intensive tasks to consume large amounts of energy.
As smart devices continually collect large amounts of data, researchers strive to discover new ways to process it all without consuming an increasing amount of energy. Currently, the memory resistor, or “memristor”, is the most developed technology capable of combining processing and memory functions. But memristors still suffer from energy-expensive switching.
“For several decades, the paradigm in electronics has been to build everything from transistors and use the same silicon architecture,” Hersam said.
“Significant progress has been made by simply integrating more and more transistors into integrated circuits. There is no denying the success of this strategy, but it comes at the cost of high energy consumption, especially in today’s big data era where digital computing is poised to overwhelm the grid. We need to rethink computing hardware, especially for AI and machine learning tasks.”
To rethink this paradigm, Hersam and his team explored new advances in the physics of moiré patterns, a type of geometric design that appears when two patterns overlap.
When two-dimensional materials are stacked, new properties emerge that do not exist in a single layer. And when these layers are twisted to form a moiré pattern, unprecedented tuning of electronic properties becomes possible.
For the new device, the researchers combined two different types of atomically thin materials: bilayer graphene and hexagonal boron nitride. When stacked and deliberately twisted, the materials formed a moiré pattern.
By rotating one layer relative to the other, the researchers could obtain different electronic properties in each graphene layer, even though the layers are separated by only atomic-scale dimensions. With the right choice of twist angle, the researchers exploited moiré physics for neuromorphic functionality at room temperature.
“With twist as a new design parameter, the number of permutations is vast,” Hersam said. “Graphene and hexagonal boron nitride are very similar structurally but just different enough to achieve exceptionally strong moiré effects.”
To test the transistor, Hersam and his team trained it to recognize similar, but not identical, patterns. Earlier this month, Hersam introduced a new nanoelectronic device that can analyze and categorize data in an energy-efficient way, but the new synaptic transistor takes machine learning and AI a step further.
“If AI is supposed to mimic human thinking, one of the lowest-level tasks would be to classify data, which is simply sorting it into bins,” Hersam said. “Our goal is to advance AI technology toward higher-level thinking. Real-world conditions are often more complicated than current AI algorithms can handle, which is why we tested our new devices in more complicated conditions to verify their advanced capabilities.”
The researchers first showed the device a pattern: 000 (three zeros in a row). Then they asked the AI to identify similar patterns, such as 111 or 101. “If we trained it to detect 000 and then gave it 111 and 101, it knows that 111 is more like 000 than 101,” Hersam explained. “000 and 111 are not exactly the same, but both are made up of three identical digits in a row. Recognizing that similarity is a form of higher-level cognition known as associative learning.”
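The similarity in this example is structural rather than bitwise: 111 shares none of 000’s bits, yet both consist of three identical digits in a row, while 101 does not. A toy Python sketch of that judgment (purely illustrative — the scoring function is a hypothetical stand-in, not how the device physically computes):

```python
def repeat_score(pattern: str) -> int:
    """Count adjacent equal digits -- a crude 'repetition' feature."""
    return sum(1 for a, b in zip(pattern, pattern[1:]) if a == b)

def similarity(trained: str, probe: str) -> int:
    """Compare patterns by repetition structure rather than by matching bits."""
    return -abs(repeat_score(trained) - repeat_score(probe))

trained = "000"          # the pattern the device was trained on
probes = ["111", "101"]  # the probe patterns from the experiment
best = max(probes, key=lambda p: similarity(trained, p))
print(best)  # -> 111, judged more similar to 000 than 101
```

Note that a naive bit-by-bit comparison would give the opposite answer (101 matches 000 in two positions, 111 in none), which is why this kind of judgment is considered higher-level cognition rather than simple matching.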
In experiments, the new synaptic transistor successfully recognized similar patterns, displaying its associative memory. Even when the researchers threw it curveballs, such as giving it incomplete patterns, it still demonstrated successful associative learning.
“Current AI can be easy to confuse, which can cause major problems in certain contexts,” Hersam said. “Imagine if you are operating an autonomous vehicle and the weather conditions deteriorate. The vehicle may not be able to interpret more complex sensor data as well as a human driver. But even when we gave an imperfect input to our transistor, it could still identify the correct answer.”
Funding: The study, “Moiré synaptic transistor with neuromorphic functionality at room temperature,” was primarily supported by the National Science Foundation.
About this neurotech and AI research news
Author: Amanda Morris
Source: Northwestern University
Contact: Amanda Morris – Northwestern University
Image: The image is credited to Neuroscience News
Original research: Closed access.
“Moiré synaptic transistor with neuromorphic functionality at room temperature” by Mark C. Hersam et al. Nature
Moiré synaptic transistor with neuromorphic functionality at room temperature
Moiré quantum materials host exotic electronic phenomena through enhanced internal Coulomb interactions in twisted two-dimensional heterostructures. When combined with the exceptionally high electrostatic control in atomically thin materials, moiré heterostructures have the potential to enable next-generation electronic devices with unprecedented functionality.
However, despite extensive exploration, electronic moiré phenomena have so far been limited to impractical cryogenic temperatures, thus precluding real-world applications of moiré quantum materials.
Here we report the experimental realization and room temperature operation of a low-power (20 pW) moiré synaptic transistor based on an asymmetric bilayer graphene/hexagonal boron nitride moiré heterostructure. The asymmetric moiré potential gives rise to robust electronic ratchet states, which enable hysteretic and nonvolatile injection of charge carriers that control the conductance of the device.
Asymmetric gating in dual-gated moiré heterostructures realizes diverse biorealistic neuromorphic functionalities, such as reconfigurable synaptic responses, spatiotemporal tempotrons, and Bienenstock–Cooper–Munro input-specific adaptation.
In this way, the moiré synaptic transistor enables efficient in-memory computing designs and state-of-the-art hardware accelerators for artificial intelligence and machine learning.
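For background on the last functionality named in the abstract: the classic Bienenstock–Cooper–Munro (BCM) rule strengthens or weakens a synapse depending on whether postsynaptic activity is above or below a sliding threshold. The sketch below is the textbook form of the rule, not the paper’s hardware implementation; the learning rate, time constant, and starting values are illustrative assumptions.

```python
def bcm_step(w, x, theta, lr=0.1, tau=10.0):
    """One discrete BCM update for a single synapse on a linear neuron.

    w: synaptic weight, x: presynaptic input,
    theta: sliding modification threshold (illustrative discretization).
    """
    y = w * x                              # postsynaptic activity
    w = w + lr * x * y * (y - theta)       # potentiate if y > theta, depress if y < theta
    theta = theta + (y * y - theta) / tau  # threshold tracks recent squared activity
    return w, theta

# One update from an arbitrary starting point:
w, theta = bcm_step(w=0.5, x=1.0, theta=0.2)
print(w, theta)  # -> 0.515 0.205
```

Because the threshold itself adapts to each input history, the response becomes specific to that input — the “input-specific adaptation” the abstract refers to.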