Since the mid-1960s, the solid-state industry has been guided by Moore's law, the forecast made by Gordon Moore, co-founder of microprocessor giant Intel, that ever-shrinking devices would deliver enhanced computing performance and energy efficiency.
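In its commonly cited form, the law predicts that the number of transistors on a chip doubles roughly every two years. As a rough, purely illustrative Python sketch of what that compounding implies (the starting count and time span below are arbitrary choices for illustration, not figures from the article):

```python
# Moore's law, in its commonly cited form: transistor counts on a chip
# double roughly every two years. Starting values are illustrative only.

transistors = 2_300  # on the order of an early-1970s microprocessor
for year in range(1971, 2001, 2):
    print(year, f"{transistors:,}")
    transistors *= 2
```

Fifteen doublings turn a few thousand transistors into tens of millions, which is roughly the trajectory the industry followed for decades.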

As detailed in a report published in Nature, solid-state computing has had a long run since the 1950s, when transistors began replacing vacuum tubes as the key elements of electronic circuits.

Generations of new solid-state devices that processed and stored information electronically at ever-faster speeds came and went: germanium transistors were replaced by silicon transistors, which were succeeded by integrated circuits, and then by increasingly complex chips packed with ever-higher counts of ever-smaller transistors.

Nanotechnology advances have enabled the tiniest features on today's most advanced integrated circuits to shrink to a near-atomic scale, although such dimensions are not compatible with present device designs.

[Image: Moore's Law (Photo: Wikimedia Commons)]

New Nanomaterials Needed

The next major step in computing requires not just new nanomaterials but a new architecture as well.

Complementary metal-oxide-semiconductor or CMOS transistors, as detailed in a ScienceDirect report, have been the standard building blocks for integrated circuits since the 1980s.

Crucially, CMOS circuits, like the generations of digital computers before them, depend on the fundamental architecture that John von Neumann settled on in the mid-20th century.

His architecture splits the electronics that store a computer's data from those that process digital information.

In this design, the computer stores information in a single place and then sends it to other circuits for processing. Separating stored memory from the processor keeps the signals from interfering with each other and preserves the precision needed for digital computing.
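To make the separation concrete, here is a minimal, purely illustrative Python sketch of the von Neumann cycle; the memory contents and the tiny instruction format are invented for this example and do not model any real processor:

```python
# A toy von Neumann machine: data lives in one shared memory, and a separate
# "processor" loop fetches operands, computes, and writes results back.
# The memory contents and instruction format are invented for illustration.

memory = [3, 4, 0, 10, 20, 0]  # single store for both inputs and results

# Each instruction: (operation, source address 1, source address 2, destination)
program = [
    ("add", 0, 1, 2),  # memory[2] = memory[0] + memory[1]
    ("mul", 3, 4, 5),  # memory[5] = memory[3] * memory[4]
]

for op, src1, src2, dst in program:
    a, b = memory[src1], memory[src2]         # fetch: move data out of memory
    result = a + b if op == "add" else a * b  # process: compute in the "CPU"
    memory[dst] = result                      # store: ship the result back

print(memory)  # [3, 4, 7, 10, 20, 200]
```

Every result makes a round trip between memory and the processor, and it is exactly this constant shuttling that in-memory and neuromorphic approaches try to avoid.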

Aim to Switch to Neuromorphic Systems

Another objective is to switch to neuromorphic systems, which use algorithms and network designs that mimic the high connectivity and parallel processing of the human brain.

That means developing new artificial neurons and synapses that are compatible with electronic processing yet go beyond the performance of CMOS circuits, explained Mark Hersam, a chemistry and materials science researcher.
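As a rough software analogy for the hardware neurons and synapses Hersam describes (the weights and threshold below are arbitrary values chosen for illustration, not parameters from any real device):

```python
# A minimal artificial neuron: each synapse is a weight, and the neuron
# "fires" when the weighted sum of its inputs crosses a threshold.
# All numbers here are arbitrary and for illustration only.

def neuron(inputs, weights, threshold=1.0):
    """Return 1 if the weighted input sum reaches the threshold, else 0."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Three incoming "synapses" with different strengths:
print(neuron([1, 0, 1], [0.6, 0.9, 0.5]))  # 0.6 + 0.5 = 1.1 -> fires (1)
print(neuron([0, 1, 0], [0.6, 0.9, 0.5]))  # 0.9 < 1.0 -> stays silent (0)
```

The point of neuromorphic hardware is to realize this kind of weighted, threshold-driven behavior directly in devices, rather than simulating it step by step on conventional CMOS logic.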

The researcher also said that it is no small feat, although it would be well worth the cost. He added that he is more interested in neuromorphic computing than in in-memory processing, since he believes mimicking the brain is a more massive paradigm shift, with more potential upsides.

The challenge in both cases is to identify the best technologies for the task, a project Hersam continues to work on at Northwestern University in Evanston, Illinois.

Headed for Speedier Processing

The first clues of a major change in computing appeared around 2012, as Moore's law, described on the Science Focus site, started stalling out. Developers of deep learning, in which systems improve their performance based on previous experience, realized that the general-purpose central processing units or CPUs used in conventional computers could not meet their needs.

The strength of CPUs was their versatility, explained Wilfred Haensch, who led the group developing ideas for computer memory at the IBM Watson Research Center in Yorktown Heights, New York, until he retired in 2020.

He also said that whatever program one comes up with, the CPU can run it. Whether it can run it efficiently is a different story.

Searching for better processors for deep learning, developers at IBM turned to graphics processing units or GPUs, which were designed to carry out the advanced mathematics behind the high-speed, three-dimensional imaging used in computer games.
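To give a sense of why that kind of parallel math suits deep learning, here is a small NumPy sketch of the core workload (the array sizes are arbitrary, and NumPy here runs on the CPU; the point is that the same dense matrix product is what GPUs spread across thousands of cores):

```python
import numpy as np

# Deep learning spends most of its time on dense matrix products like this
# one: every output element is an independent dot product, so the work can
# be spread across many cores, the workload GPUs were built to parallelize.
# Array sizes are arbitrary, chosen only for illustration.

rng = np.random.default_rng(0)
inputs = rng.standard_normal((256, 784))   # a batch of 256 example vectors
weights = rng.standard_normal((784, 128))  # one layer's "synaptic" weights

outputs = inputs @ weights                 # 256 x 128 independent dot products
print(outputs.shape)                       # (256, 128)
```

Because each of those dot products can be computed independently, a GPU's many cores can work on them at once, which is what made GPUs a natural fit for deep learning.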

Moore's Law is explained in CuriousReason's YouTube video below:

Check out more news and information on Nanoparticles in Science Times.