MIT doctoral candidate Miranda Schwacke is developing neuromorphic devices that mimic brain operations to make AI more energy-efficient, addressing the massive power consumption of large AI models and its environmental impact.
Researchers at the University of Surrey have developed a brain-inspired approach called Topographical Sparse Mapping, which mimics the human brain's efficient neural wiring by connecting each neuron only to nearby or related inputs rather than to every input, reducing the energy consumption of AI models and improving their sustainability.
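The announcement does not include code, but the core idea can be illustrated with a short sketch. Assuming Topographical Sparse Mapping amounts to replacing a fully connected input layer with one whose units connect only to a local neighborhood of inputs, a masked linear layer captures the gist; the class name, window size, and mask construction below are illustrative, not the Surrey authors' implementation:

```python
import torch
import torch.nn as nn

class TopographicSparseLinear(nn.Module):
    """Linear layer masked so each output unit sees only a local
    window of inputs (illustrative sketch, not the published method)."""
    def __init__(self, in_features, out_features, window=32):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        mask = torch.zeros(out_features, in_features)
        for j in range(out_features):
            # Map each output unit to a position along the input axis,
            # then permit connections only within a local window.
            center = int(j * in_features / out_features)
            lo = max(0, center - window // 2)
            hi = min(in_features, center + window // 2)
            mask[j, lo:hi] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Zero out non-local weights; masked weights also receive
        # no gradient, so the layer stays sparse during training.
        return nn.functional.linear(
            x, self.linear.weight * self.mask, self.linear.bias)

layer = TopographicSparseLinear(784, 256, window=32)
out = layer(torch.randn(8, 784))  # batch of 8 flattened 28x28 inputs
print(out.shape)  # torch.Size([8, 256])
```

Because most weights are zeroed out, far fewer effective parameters are stored and updated than in a dense layer of the same shape, which is where savings of the kind the researchers report would come from.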
Chinese researchers at Zhejiang University have developed Darwin Monkey, a brain-inspired supercomputer with over 2 billion artificial neurons that mimics mammalian neural structures to advance AI and neuroscience research while consuming only 2,000 watts of power.
Researchers at Bar-Ilan University have shown that efficient learning on a shallow artificial architecture can match the classification success rates of deep learning architectures at lower computational complexity, raising the question of whether deep learning is necessary for artificial intelligence. Realizing shallow architectures efficiently will require shifts in the design of advanced GPUs and future dedicated hardware, but brain-inspired shallow learning offers comparable computational capability with reduced complexity and energy consumption.
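The study concerns learning dynamics rather than a particular codebase, but the architectural contrast is easy to state. A minimal sketch, assuming the shallow alternative is a single wide hidden layer set against a deeper, narrower stack; the widths and depths here are arbitrary placeholders, not the architectures from the study:

```python
import torch
import torch.nn as nn

# Deep, narrow stack: four sequential weight layers.
deep = nn.Sequential(
    nn.Linear(784, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

# Shallow, wide alternative: a single hidden layer. The Bar-Ilan
# result suggests architectures like this can reach comparable
# classification accuracy given an efficient learning rule.
shallow = nn.Sequential(
    nn.Linear(784, 2048), nn.ReLU(),
    nn.Linear(2048, 10),
)

x = torch.randn(8, 784)
print(deep(x).shape, shallow(x).shape)  # both: torch.Size([8, 10])
```

The shallow model trades depth for width: it computes its output in two matrix multiplications instead of four sequential ones, a highly parallel pattern that current GPUs, tuned for deep pipelines, are not yet optimized to exploit, which is the hardware shift the researchers point to.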