Redundancy Maximization as a Principle of Associative Memory Learning

Abstract

Associative memory, traditionally modeled by Hopfield networks, enables the retrieval of previously stored patterns from partial or noisy cues. Yet the local computational principles required to enable this function remain incompletely understood. To formally characterize the local information processing in such systems, we employ a recent extension of information theory, Partial Information Decomposition (PID). PID decomposes the contribution of different inputs to an output into unique information from each input, redundant information shared across inputs, and synergistic information that emerges only from combining inputs. Applying this framework to individual neurons in classical Hopfield networks, we find that below the memory capacity, the information in a neuron's activity is characterized by high redundancy between the external pattern input and the internal recurrent input, while synergy and unique information remain close to zero until the memory capacity is surpassed and performance drops steeply. Inspired by this observation, we use redundancy as an information-theoretic learning goal that is directly optimized for each neuron, dramatically increasing the network's memory capacity to 1.59, a more than tenfold improvement over the 0.14 capacity of classical Hopfield networks, and even outperforming recent state-of-the-art implementations of Hopfield networks. Ultimately, this work establishes redundancy maximization as a new design principle for associative memories and opens pathways for new associative memory models based on information-theoretic goals.
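
As a point of reference for the capacity figures quoted above, here is a minimal sketch of a classical Hopfield network with Hebbian (outer-product) learning and retrieval from a corrupted cue. The network size, noise level, and variable names are illustrative choices of ours, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                       # number of neurons
P = int(0.1 * N)              # stored patterns; classical capacity is roughly 0.14 * N
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) learning rule with self-connections removed
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(cue, steps=20):
    """Synchronous retrieval: repeatedly take the sign of the recurrent input."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Retrieve the first stored pattern from a cue with 10% of its bits flipped
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
retrieved = recall(cue)
print("overlap with stored pattern:", (retrieved @ patterns[0]) / N)
```

With the load kept below roughly 0.14 patterns per neuron, this scheme recovers the stored pattern from the noisy cue (overlap near 1.0); beyond that load, retrieval degrades sharply, which is the regime where the paper reports the steep drop in performance.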

Key Contributions

This paper introduces Partial Information Decomposition (PID) as a framework for analyzing the local computational principles of associative memory systems such as Hopfield networks. It shows that below the memory capacity, a neuron's activity is dominated by redundancy between the external pattern input and the internal recurrent input, and that directly maximizing this redundancy as a per-neuron learning goal raises the network's capacity from 0.14 to 1.59, establishing redundancy maximization as a learning principle for these networks.
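
To make the redundancy notion concrete, the sketch below estimates one standard PID redundancy measure, the I_min redundancy of Williams and Beer, between a neuron's external pattern input and its recurrent input with respect to its output. The paper does not state which redundancy measure it analyzes or optimizes, so this is an illustrative stand-in rather than the authors' objective, and the toy data generation is purely hypothetical.

```python
import numpy as np

def specific_information(p_xy, y):
    """Specific information I(Y=y; X) = sum_x p(x|y) * log2(p(y|x) / p(y)),
    for a joint probability table indexed as p_xy[x, y]."""
    p_y = p_xy.sum(axis=0)                 # marginal P(Y)
    p_x = p_xy.sum(axis=1)                 # marginal P(X)
    p_x_given_y = p_xy[:, y] / p_y[y]
    with np.errstate(divide="ignore", invalid="ignore"):
        p_y_given_x = p_xy[:, y] / p_x
        terms = p_x_given_y * np.log2(p_y_given_x / p_y[y])
    return np.nansum(terms)               # 0 * log(0) terms drop out as NaN

def i_min_redundancy(p_x1y, p_x2y):
    """Williams-Beer redundancy: expected minimum specific information over the two sources."""
    p_y = p_x1y.sum(axis=0)
    return sum(
        p_y[y] * min(specific_information(p_x1y, y), specific_information(p_x2y, y))
        for y in range(len(p_y))
    )

def joint_table(x, y, n_states=2):
    """Empirical joint probability table P(X, Y) from paired integer samples."""
    t = np.zeros((n_states, n_states))
    np.add.at(t, (x, y), 1)
    return t / len(x)

# Hypothetical toy data for one binary neuron: the recurrent input mostly agrees
# with the external pattern input, and the output follows the pattern.
rng = np.random.default_rng(1)
pattern = rng.integers(0, 2, size=10_000)
recurrent = np.where(rng.random(10_000) < 0.9, pattern, 1 - pattern)
output = pattern

redundancy = i_min_redundancy(joint_table(pattern, output),
                              joint_table(recurrent, output))
print(f"I_min redundancy: {redundancy:.3f} bits")
```

In this toy setting the recurrent input largely agrees with the pattern input, so most of the information the output carries about either source is shared between them; this is the high-redundancy regime the paper observes below memory capacity.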

Business Value

Understanding fundamental principles of memory and information processing can lead to more efficient and robust AI systems for tasks requiring memory and pattern recognition.