11/13/24 Series: "Unleashing Sparse Regularization: Equation for Setting Regularization Strength to Achieve Targeted Compression in CNNs and Transformers"

Join us on November 13th for another exciting talk in our Fall 2024 CTML Seminar Series! Gilmer Valdes’ talk “Unleashing Sparse Regularization: Equation for Setting Regularization Strength to Achieve Targeted Compression in CNNs and Transformers” will take place at 11:00 AM at Berkeley Way West, 5th Floor, Room 5401.

Setting the regularization strength parameter in neural networks is often done manually, requiring computationally heavy grid searches that limit practitioners to testing only a few values. This constraint hinders the full use of sparse regularization’s benefits, such as model compression, automatic architecture search, and enhanced interpretability. By introducing an equation to directly calculate the regularization strength needed for a specified compression level, Lockout eliminates the need for exhaustive parameter searches. This approach lets practitioners achieve targeted sparsity efficiently, making sparse regularization more accessible and powerful. Extensive experimental results on CNNs and Transformers will be presented.
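To illustrate the core mechanism the talk builds on (not the speaker’s specific equation, which is the subject of the seminar), the sketch below uses the standard soft-thresholding proximal operator for an L1 penalty: each choice of regularization strength λ zeroes out a different fraction of weights, so λ directly controls the achieved compression. The weight values here are random stand-ins for a layer’s parameters.

```python
import numpy as np

def soft_threshold(w, lam):
    # Proximal operator of the L1 penalty lam * ||w||_1:
    # shrinks every weight toward zero and zeroes those with |w| <= lam.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(size=10_000)  # stand-in for a trained layer's weights

# Larger lam -> more weights zeroed -> higher compression.
for lam in (0.1, 0.5, 1.0):
    sparsity = np.mean(soft_threshold(w, lam) == 0.0)
    print(f"lam={lam}: {sparsity:.1%} of weights zeroed")
```

In practice this relationship between λ and sparsity is what makes a closed-form equation for the regularization strength valuable: instead of grid-searching λ, one can target a desired compression level directly.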