
Training Convolutional Neural Networks with the Forward-Forward algorithm

Abstract

Recent successes in image analysis with deep neural networks are achieved almost exclusively with Convolutional Neural Networks (CNNs), typically trained using the backpropagation (BP) algorithm. In a 2022 preprint, Geoffrey Hinton proposed the Forward-Forward (FF) algorithm as a biologically inspired alternative, in which positive and negative examples are jointly presented to the network and training is guided by a locally defined goodness function. Here, we extend the FF paradigm to CNNs. We introduce two spatially extended labeling strategies, based on Fourier patterns and on morphological transformations, that give convolutional layers access to label information at all spatial positions. On CIFAR10, we show that deeper FF-trained CNNs can be optimized successfully and that morphology-based labels prevent shortcut solutions on datasets with more complex and fine-grained features. On CIFAR100, carefully designed label sets scale effectively to 100 classes. Class Activation Maps reveal that FF-trained CNNs learn meaningful and complementary features across layers. Together, these results demonstrate that FF training is feasible beyond fully connected networks, provide new insights into its learning dynamics and stability, and highlight its potential for neuromorphic computing and biologically inspired learning.
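The locally defined goodness function at the heart of FF can be illustrated with a minimal sketch. Following Hinton's preprint, a common choice of goodness is the sum of squared activations of a layer, and each layer is trained so that goodness exceeds a threshold for positive examples and falls below it for negative ones. The threshold value `theta` and the logistic loss form below are assumptions for illustration, not taken from this paper.

```python
import numpy as np

def goodness(activations):
    # A common FF "goodness" measure: sum of squared activations per sample.
    # `activations` has shape (batch, features).
    return (activations ** 2).sum(axis=1)

def ff_layer_loss(pos_act, neg_act, theta=2.0):
    # Per-layer FF objective (illustrative): drive goodness above `theta`
    # for positive samples and below `theta` for negative samples,
    # using a logistic loss on (goodness - theta).
    g_pos = goodness(pos_act)
    g_neg = goodness(neg_act)
    loss_pos = np.log1p(np.exp(-(g_pos - theta)))  # small when g_pos >> theta
    loss_neg = np.log1p(np.exp(g_neg - theta))     # small when g_neg << theta
    return loss_pos.mean() + loss_neg.mean()
```

Because the loss depends only on a single layer's own activations, each layer can be updated without backpropagating errors through the rest of the network, which is the property that makes FF attractive for neuromorphic hardware.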

Key Contributions

This paper extends the Forward-Forward (FF) algorithm, a biologically inspired alternative to backpropagation, to Convolutional Neural Networks (CNNs). It introduces novel spatially extended labeling strategies (using Fourier patterns and morphological transformations) that enable FF to train deeper CNNs successfully on datasets like CIFAR10 and CIFAR100.
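The idea behind the spatially extended labels can be sketched as follows. Instead of writing the label into a few fixed pixels (as in Hinton's original MNIST setup), a per-class pattern covers the whole image, so every receptive field of a convolutional layer sees label information. The specific encoding below, a 2D sinusoid whose orientation varies with the class index, is a hypothetical stand-in for the paper's Fourier-pattern labels; the actual patterns and how they are combined with the image may differ.

```python
import numpy as np

def fourier_label_map(label, num_classes, h, w, amplitude=0.5):
    # Illustrative spatial label: a 2D sinusoid whose orientation encodes
    # the class index, so label information is present at every pixel.
    theta = np.pi * label / num_classes
    ys, xs = np.mgrid[0:h, 0:w]
    phase = (xs * np.cos(theta) + ys * np.sin(theta)) / 8.0
    return amplitude * np.sin(2.0 * np.pi * phase)

def label_image(image, label, num_classes):
    # Superimpose the label pattern on a (h, w) image. A positive sample
    # uses the true label; a negative sample uses a wrong label.
    h, w = image.shape
    return image + fourier_label_map(label, num_classes, h, w)
```

Under this scheme, positive and negative samples for FF training differ only in the embedded label pattern, which forces the network to learn the association between image content and label rather than a shortcut based on a few pixels.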

Business Value

Could lead to new, potentially more efficient or biologically plausible training methods for deep learning models, impacting future AI development and hardware design.