As computers grow larger and more powerful, and parallel computing and supercomputers become the norm, we are approaching a wall in power consumption and miniaturization. Penn State researchers have created a 2D device that works more like the brain than conventional computer architectures do and can provide more than just yes-or-no answers.
"Complexity scaling is also ending owing to the non-scalability of the conventional von Neumann computing architecture and the impending 'dark silicon era' that poses a severe threat to multi-core chip technology," the researchers note.

The dark silicon era refers to the fact that not all of a chip's transistors can be powered at the same time, because of the heat they generate. The von Neumann architecture, the standard design of most modern computers, is based on a binary approach, "yes" or "no", in which program data and instructions are stored in the same memory and share the same communication channel.
"Because of this, data operations and instruction acquisition cannot be performed at the same time," said Saptarshi Das, an assistant professor of engineering science and mechanics. "For complex decision-making using neural networks, you might need a cluster of supercomputers trying to use parallel processors at the same time, a million laptops in parallel, that would take up a football field. Portable health devices, for instance, cannot work that way."
What About The Solution?
The solution, according to Das, is to create brain-inspired, probabilistic neural networks that do not rely on devices that are simply on or off, but instead produce a range of probabilistic responses that are then compared with the learned database in the machine. To do this, the researchers developed a Gaussian transistor made from 2D materials, molybdenum disulfide and black phosphorus. These devices are more energy efficient and produce less heat, making them well suited for scaling up systems.
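To illustrate the contrast between a hard on/off switch and a graded, Gaussian-shaped response, here is a minimal sketch (my own toy functions, not the authors' device model); the threshold, mean, and width values are arbitrary illustrative choices:

```python
import math

def binary_switch(v, threshold=0.5):
    """Conventional binary logic: a hard yes/no decision at a threshold."""
    return 1 if v > threshold else 0

def gaussian_response(v, mu=0.5, sigma=0.15):
    """Graded response: a bell-curve output that peaks at mu and
    falls off smoothly on both sides, instead of a hard transition."""
    return math.exp(-((v - mu) ** 2) / (2 * sigma ** 2))

for v in (0.2, 0.5, 0.8):
    print(v, binary_switch(v), round(gaussian_response(v), 3))
```

The binary switch throws away everything except which side of the threshold the input falls on; the Gaussian response preserves how close the input is to the peak, which is the kind of graded information a probabilistic network can exploit.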
"The human brain operates on just 20 watts of power," said Das. "It has over 100 billion neurons and does not use von Neumann's architecture."
The researchers note that it is not just heat and power that have become a problem; it is also getting harder to pack more transistors into the same space. "Size scaling has stopped," Das said. "We can fit only about 1 billion transistors on a chip. We need more complexity, like the brain."
The idea of neural networks has been around since the 1980s, but hardware well suited to running them has been lacking.
"Similar to the workings of a human brain, key features are extracted from a set of training samples to help the neural network learn," explained Amritanand Sebastian, a graduate student in engineering science and mechanics. The researchers tested their neural network on electroencephalography (EEG) signals, graphical representations of brain waves. After being fed many example EEGs, the system could take a new EEG signal, analyze it, and determine which brain state it represented.
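The classification scheme the paragraph describes can be sketched with a classic probabilistic neural network (in the sense of Specht's PNN, which uses Gaussian kernels around training samples). This is a generic illustration, not the authors' implementation, and the feature vectors below are synthetic stand-ins, not real EEG data:

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Specht-style probabilistic neural network: every training sample
    contributes a Gaussian kernel centered on itself; each class is
    scored by its mean kernel activation, and the best score wins."""
    scores = {}
    for c in np.unique(train_y):
        Xc = train_X[train_y == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)          # squared distances
        scores[c] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy stand-in "EEG feature" clusters around 0 (class 0) and 1 (class 1).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 4)), rng.normal(1.0, 0.3, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
print(pnn_classify(np.full(4, 0.9), X, y))  # a point near cluster 1
```

Note that, matching Das's point below, there is no iterative training loop here: the stored samples themselves define the Gaussian kernels, so "training" amounts to memorizing examples.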
"We don't need as extensive a training period or database for a probabilistic neural network as we do for an artificial neural network," Das said.
The researchers see applications for the neural network's computations in medicine, since its findings are not necessarily 100 percent yes or no. They also note that medical devices need to be small and portable, and should use as little energy as possible.
Das and colleagues call their device a Gaussian synapse. It is based on a two-transistor configuration in which the molybdenum disulfide transistor conducts electrons, while the black phosphorus transistor conducts both electrons and holes. The device acts as two variable resistors in series, and their combination produces a transfer curve with two tails, which corresponds to a Gaussian function.
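A rough way to see why two complementary transistors in series yield a bell-shaped curve is to model one branch whose conductance rises with gate voltage (like the n-type molybdenum disulfide) and one whose conductance falls (like the p-type branch of black phosphorus); two conductances in series combine as Gn·Gp/(Gn + Gp), which peaks between the two thresholds. This is a toy model with made-up sigmoid parameters, not the published device physics:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def series_current(vg, v_n=0.0, v_p=1.0, k=8.0):
    """Toy Gaussian-synapse model: an n-type branch turning on past v_n,
    in series with a p-type branch turning off past v_p. The series
    conductance Gn*Gp/(Gn+Gp) is small whenever either branch is off,
    so the transfer curve rises, peaks midway, and falls: a bell shape
    with two tails, approximating a Gaussian."""
    gn = sigmoid(k * (vg - v_n))    # n-type: off -> on with gate voltage
    gp = sigmoid(-k * (vg - v_p))   # p-type: on -> off with gate voltage
    return gn * gp / (gn + gp)

curve = [series_current(v / 10) for v in range(-10, 21)]
peak = max(range(len(curve)), key=curve.__getitem__)
print("peak at vg =", (peak - 10) / 10)  # midway between v_n and v_p
```

Because whichever branch has the smaller conductance dominates a series connection, the output is suppressed at both voltage extremes, which is what gives the curve its two Gaussian-like tails.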
Andrew Pannone and Shiva Subbulakshmi, an engineering student at Amrita Vishwa Vidyapeetham, India, who was a summer intern in Das's lab, were also involved in this work.
This work was supported by the Air Force Office of Scientific Research.