The Technology
Training neural networks is a highly demanding process with respect to computational resources and time. Moreover, updating trained networks in deployment with new data is costly and may degrade the overall performance of the network (e.g. due to “catastrophic forgetting”). As the number of parameters in neural models grows rapidly, training and updating networks become a critical bottleneck, creating the need for cost- and time-efficient solutions.
Our innovative model, Correlation Mode Decomposition (CMD), is a new concept for modeling the training dynamics of neural networks. It builds on the very high correlation exhibited by the network parameters during training, enabling a network containing millions of parameters to be represented by only 10 or fewer modes.
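To make the concept concrete, below is a minimal NumPy sketch of the underlying idea, not the published CMD algorithm: parameter trajectories recorded during training are grouped by their correlation with a few reference trajectories, and each parameter is then reconstructed as an affine function of its mode. The synthetic data, the greedy reference selection, and all variable names are illustrative assumptions.

import numpy as np

# Illustrative sketch of the CMD idea: during training, parameter
# trajectories are highly correlated, so each one can be approximated
# as an affine function of one of a few shared "mode" trajectories.

rng = np.random.default_rng(0)
T, P, K = 50, 1000, 3                      # training steps, parameters, modes

# Synthetic training dynamics (assumption): each parameter follows one
# of K hidden random-walk modes, up to an affine map plus small noise.
hidden = np.cumsum(rng.standard_normal((T, K)), axis=0)
true_mode = rng.integers(0, K, size=P)
a = rng.normal(1.0, 0.3, P)
b = rng.normal(0.0, 0.1, P)
W = hidden[:, true_mode] * a + b + 0.01 * rng.standard_normal((T, P))

# Normalize each parameter's trajectory so inner products are correlations.
Z = (W - W.mean(axis=0)) / (W.std(axis=0) + 1e-12)

# Pick K reference trajectories greedily: repeatedly add the parameter
# least correlated with the references chosen so far (a simple stand-in
# for however the real method selects its modes).
ref_idx = [0]
for _ in range(K - 1):
    c = np.abs(Z.T @ Z[:, ref_idx]) / T    # (P, len(ref_idx)) |correlation|
    ref_idx.append(int(c.max(axis=1).argmin()))
refs = W[:, ref_idx]                        # (T, K) reference trajectories

# Assign every parameter to its most correlated reference.
labels = np.abs(Z.T @ Z[:, ref_idx] / T).argmax(axis=1)

# Fit w_i(t) ~= A_i * ref_{k(i)}(t) + B_i per mode via least squares.
A = np.empty(P)
B = np.empty(P)
for k in range(K):
    idx = np.flatnonzero(labels == k)
    X = np.stack([refs[:, k], np.ones(T)], axis=1)        # (T, 2) design matrix
    coef, *_ = np.linalg.lstsq(X, W[:, idx], rcond=None)  # (2, len(idx))
    A[idx], B[idx] = coef[0], coef[1]

# Millions of trajectories compress to K modes + 2 scalars per parameter.
W_hat = refs[:, labels] * A + B
print("relative error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))

In this toy setting, storage for the training dynamics drops from T×P snapshot values to K×T mode values plus two scalars per parameter, which illustrates the source of the efficiency gains listed below.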
Advantages
- More efficient training
- Fewer computations
- Allows fast updates with new data
- Backed by theory, analysis and extensive experiments
- Can cope with a variety of architectures
Applications and Opportunities
- Patent-pending technology
- Can be applied to computer vision, natural language processing (NLP), and other domains
- Can be integrated into a software package or specialized hardware
- Recently presented at ICLR 2024 (International Conference on Learning Representations)