A new study published in Scientific Reports examines whether brain-inspired shallow feedforward networks can learn complex classification tasks efficiently. The goal is to determine whether such networks can offer lower computational complexity than the deep learning architectures currently in use. The study was led by Prof. Ido Kanter from Bar-Ilan’s Department of Physics and Gonda Multidisciplinary Brain Research Center.
The study sought to answer whether brain-inspired shallow learning can match the classification success rates of deep learning architectures built from many layers and filters, but with less computational complexity. The findings showed that efficient learning on shallow architectures can indeed reach success rates comparable to deep learning; however, realizing this efficiency requires a shift in the properties of advanced GPU technology and the development of dedicated hardware.
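To make the contrast concrete, the sketch below compares a shallow, wide feedforward classifier with a deep convolutional network. It is a minimal illustration only, not the architecture or training scheme used in the paper: the layer widths, input shape (CIFAR-10-like 3x32x32 images, 10 classes), and the use of PyTorch are all assumptions chosen for clarity.

```python
# Illustrative sketch (NOT the paper's architectures): a shallow, wide feedforward
# classifier vs. a deep convolutional network, with parameter counts as a rough
# proxy for model complexity. Input shape and layer widths are assumed.
import torch
import torch.nn as nn

NUM_CLASSES = 10
INPUT_SHAPE = (3, 32, 32)           # channels, height, width (assumed for illustration)
FLAT_DIM = 3 * 32 * 32

# Shallow network: a single wide hidden layer between input and output.
shallow_net = nn.Sequential(
    nn.Flatten(),
    nn.Linear(FLAT_DIM, 4096),      # one wide hidden layer (width chosen arbitrarily)
    nn.ReLU(),
    nn.Linear(4096, NUM_CLASSES),
)

# Deep network: stacked convolutional layers with filters, then a classifier head.
deep_net = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(128 * 4 * 4, 256), nn.ReLU(),
    nn.Linear(256, NUM_CLASSES),
)

def count_params(model: nn.Module) -> int:
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

if __name__ == "__main__":
    x = torch.randn(8, *INPUT_SHAPE)                     # dummy batch of 8 images
    print("shallow params:", count_params(shallow_net))  # ~12.6M, but only 2 layers
    print("deep params:   ", count_params(deep_net))     # ~0.6M, spread over many layers
    print("shallow output:", shallow_net(x).shape)       # torch.Size([8, 10])
    print("deep output:   ", deep_net(x).shape)          # torch.Size([8, 10])
```

The point of the comparison is that depth and computational cost are different axes: the shallow network trades many stacked filter layers for a single wide layer, and the study asks whether such wide-but-shallow designs, paired with suitable hardware, can learn as successfully as their deep counterparts.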
The research team also found that efficient learning on brain-inspired shallow architectures is connected to efficient dendritic tree learning, building on Prof. Kanter's earlier research on sub-dendritic adaptation in neuronal cultures and other anisotropic properties of neurons.
Traditionally, brain dynamics and machine learning have been studied separately. However, recent research has shown that brain dynamics can serve as a source of new, efficient forms of artificial intelligence. If brain-inspired shallow learning can indeed achieve results comparable to deep learning with less computational complexity, it could motivate the development of dedicated hardware for faster, more efficient implementations of shallow learning.
Source: Bar-Ilan University