“Less is more”, once the foundational motto of minimalist art, is making its way into artificial intelligence. After a maximalist decade (2012–2022) of ever-larger computers training ever-larger neural networks on ever-larger datasets, a countertrend is emerging. What if human-level performance could be achieved with less computing, less memory, and less supervision?
In deep learning, the research prospect of doing more with less proceeds from a twofold motivation: to save energy and to save human effort. In this context, the MuReNN project imagines a “less is more” approach to AI systems, specifically to one of their most fundamental constituents: the deep convolutional network, or convnet for short. The overarching goal of MuReNN is to improve both the energy efficiency and the annotation efficiency of convnets without compromising their capacity for statistical generalization in high dimensions.