The human brain can process multiple types of input simultaneously, such as sight, sound, smell, and touch. Artificial intelligence (AI), on the other hand, has come a long way, but much still needs to be done to make it truly intelligent.

Robots typically rely on more straightforward sensory input, but a team of researchers has applied this biological concept to AI by developing the first artificial multisensory integrated neuron.

Natural vs. Artificial Decision-Making Skills

Robots make decisions based on their environment, but their sensors generally do not talk to each other. According to Saptarshi Das, associate professor of engineering science and mechanics at Pennsylvania State University, a collective decision can be made through a sensor processing unit, but that may not be the most efficient or effective strategy. In the human brain, one sense can influence another, allowing a person to better judge a situation.

For instance, a car might be equipped with one sensor that scans for obstacles, while a separate one senses darkness to modulate the intensity of the headlights. Because these sensors work individually, they relay their information to a central unit, which then commands the car to brake or adjust the headlights. However, this process consumes a lot of energy. In terms of both energy and speed, it would be more efficient if the sensors could communicate directly with each other, especially when the inputs from both sensors are faint.
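As a rough illustration of the difference, consider the following minimal Python sketch. The sensor readings, thresholds, and weighting scheme are hypothetical and chosen only to show the idea; they are not taken from the study or from any real vehicle system.

# Hypothetical sketch: centralized fusion vs. direct sensor cross-talk.
# All readings and thresholds below are illustrative assumptions.

def central_unit(obstacle_signal: float, darkness_signal: float) -> dict:
    """Every reading travels to one processing unit, which issues commands."""
    return {
        "brake": obstacle_signal > 0.5,
        "headlights_high": darkness_signal > 0.7,
    }

def cross_modulated(obstacle_signal: float, darkness_signal: float) -> dict:
    """Sensors influence each other directly: in darkness, a faint obstacle
    reading is weighted more heavily, so two weak cues can still trigger braking."""
    boosted_obstacle = obstacle_signal * (1.0 + darkness_signal)
    return {
        "brake": boosted_obstacle > 0.5,
        "headlights_high": darkness_signal > 0.7,
    }

# Two faint inputs: neither alone crosses the braking threshold,
# but the cross-modulated scheme combines them.
print(central_unit(0.4, 0.6))      # {'brake': False, 'headlights_high': False}
print(cross_modulated(0.4, 0.6))   # {'brake': True, 'headlights_high': False}

In this toy version, the cross-modulated scheme reacts to a faint obstacle in darkness that the purely centralized scheme would miss, which is the kind of advantage the researchers describe for sensors that communicate directly.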

In the natural environment, organisms thrive in settings with limited resources, so they have to minimize energy consumption. Applied to robots, the requirements for various sensors depend on the context. In a dark forest, for example, an organism will depend more on its sense of hearing than on its sense of sight. As an organism assesses its surroundings, its decision-making is not based on just one sense; instead, it is based on integrating sight, hearing, touch, and smell. These senses evolved together in biological systems but have been developed separately in artificial intelligence.

READ ALSO: Artificial Neurons Can Store 'Electronic Memories,' Will It Advance Elon Musk's Neuralink Technology?

First Artificial Multisensory Integrated Neuron

To address this challenge, Penn State researchers set out to combine sensors in a way that mimics how the human brain works. Led by Das, the team coupled a visual sensor and a tactile sensor so that the output of one modifies the other, with the help of visual memory.

Next, a multi-sensory neuron was fabricated by connecting a tactile sensor to a phototransistor based on a monolayer of molybdenum disulfide. This compound exhibits unique electrical and optical properties that make it well suited to detecting light and supporting transistors. Because the device generates electrical spikes in a manner similar to how neurons process information, it allows visual and tactile cues to be integrated.

The researchers simulated the touch input using the triboelectric effect in the tactile sensor: two layers slide against each other to produce electricity, encoding the touch stimuli into electrical impulses. Meanwhile, the visual input was simulated using a monolayer molybdenum disulfide photomemtransistor.
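How such integration might behave can be sketched with a simple, purely illustrative model. The spike threshold and input scales below are assumptions made for demonstration, not parameters of the Penn State device.

# Purely illustrative model of a multisensory neuron that "spikes" when the
# combined visual and tactile inputs cross a threshold. The threshold and
# input values are assumptions, not measured device parameters.

SPIKE_THRESHOLD = 1.0

def integrated_response(visual: float, tactile: float) -> bool:
    """Return True if the combined stimulus drives the neuron to spike.
    Weak cues that would not fire alone can fire together."""
    return (visual + tactile) >= SPIKE_THRESHOLD

weak_light, weak_touch = 0.6, 0.6
print(integrated_response(weak_light, 0.0))         # False: faint light alone
print(integrated_response(0.0, weak_touch))         # False: faint touch alone
print(integrated_response(weak_light, weak_touch))  # True: combined cues spike

The key behavior this toy model captures is that two faint inputs, each too weak to trigger a response on its own, can together drive the neuron to fire, mirroring how the artificial neuron integrates visual and tactile cues.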

According to the research team, the artificial multi-sensory neuron system could enhance the efficiency of sensor technology, paving the way for more eco-friendly uses of AI. As a result, it could allow robots, self-driving cars, and drones to navigate their environments effectively without using too much energy.

RELATED ARTICLE: Novel Artificial Organic Neuron Effectively Mimics Natural Nerve Cells, Making It a Promising Technology for Medical Treatments

Check out more news and information on Neuron in Science Times.