
Parameter-Efficient Orthogonal Finetuning via Butterfly Factorization

2024

Conference Paper



Abstract: Large foundation models are becoming ubiquitous, but training them from scratch is prohibitively expensive. Thus, efficiently adapting these powerful models to downstream tasks is increasingly important. In this paper, we study a principled finetuning paradigm -- Orthogonal Finetuning (OFT) -- for downstream task adaptation. Despite demonstrating good generalizability, OFT still uses a fairly large number of trainable parameters due to the high dimensionality of orthogonal matrices. To address this, we start by examining OFT from an information transmission perspective, and then identify a few key desiderata that enable better parameter-efficiency. Inspired by how the Cooley-Tukey fast Fourier transform algorithm enables efficient information transmission, we propose an efficient orthogonal parameterization using butterfly structures. We apply this parameterization to OFT, creating a novel parameter-efficient finetuning method, called Orthogonal Butterfly (BOFT). By subsuming OFT as a special case, BOFT introduces a generalized orthogonal finetuning framework. Finally, we conduct an extensive empirical study of adapting large vision transformers, large language models, and text-to-image diffusion models to various downstream tasks in vision and language.
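To make the butterfly idea concrete, the sketch below (not the authors' implementation; class and parameter names such as ButterflyOrthogonal and thetas are illustrative) builds a d x d orthogonal matrix as a product of log2(d) sparse butterfly factors, each composed of 2x2 rotations applied to index pairs at FFT-style strides, and then uses it to transform a frozen weight in the OFT spirit.

import math
import torch
import torch.nn as nn


class ButterflyOrthogonal(nn.Module):
    """Illustrative butterfly-structured orthogonal parameterization (sketch)."""

    def __init__(self, d: int):
        super().__init__()
        assert d > 0 and (d & (d - 1)) == 0, "d must be a power of two"
        self.d = d
        self.n_factors = int(math.log2(d))
        # One rotation angle per 2x2 butterfly block, per factor:
        # d/2 * log2(d) parameters instead of d*(d-1)/2 for a dense orthogonal matrix.
        self.thetas = nn.Parameter(torch.zeros(self.n_factors, d // 2))

    def forward(self) -> torch.Tensor:
        # Start from the identity and multiply in one sparse butterfly factor per stage.
        R = torch.eye(self.d)
        for k in range(self.n_factors):
            stride = 2 ** k
            B = torch.zeros(self.d, self.d)
            block = 0
            for start in range(0, self.d, 2 * stride):
                for off in range(stride):
                    i, j = start + off, start + off + stride
                    c = torch.cos(self.thetas[k, block])
                    s = torch.sin(self.thetas[k, block])
                    # 2x2 rotation acting on coordinates (i, j), FFT-style pairing.
                    B[i, i], B[i, j] = c, -s
                    B[j, i], B[j, j] = s, c
                    block += 1
            R = B @ R
        return R  # orthogonal by construction: a product of plane rotations


# OFT-style adaptation of a frozen weight with the learned orthogonal matrix.
butterfly = ButterflyOrthogonal(8)
R = butterfly()
W_frozen = torch.randn(8, 16)   # placeholder for a pretrained weight
W_adapted = R @ W_frozen        # rotating the columns of W preserves their pairwise angles
print(torch.allclose(R @ R.T, torch.eye(8), atol=1e-5))  # True

In this toy version the number of trainable angles is (d/2) * log2(d), versus d*(d-1)/2 degrees of freedom for an unstructured orthogonal matrix, which is the parameter-efficiency argument the paper develops; BOFT itself generalizes this construction (e.g., block sizes beyond 2x2 and a different block parameterization), so consult the paper and code links below for the actual method.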

Author(s): Weiyang Liu and Zeju Qiu and Yao Feng and Yuliang Xiu and Yuxuan Xue and Longhui Yu and Haiwen Feng and Zhen Liu and Juyeon Heo and Songyou Peng and Yandong Wen and Michael J. Black and Adrian Weller and Bernhard Schölkopf
Book Title: Proceedings of the Twelfth International Conference on Learning Representations (ICLR)
Year: 2024
Month: May

Department(s): Empirical Inference, Perceiving Systems
Bibtex Type: Conference Paper (inproceedings)
Paper Type: Conference

Event Name: The Twelfth International Conference on Learning Representations
Event Place: Vienna, Austria

State: Published
URL: https://openreview.net/forum?id=7NzgkEdGyr

Links: Home · Code · HuggingFace · Project

BibTeX

@inproceedings{boft,
  title = {Parameter-Efficient Orthogonal Finetuning via Butterfly Factorization},
  author = {Liu, Weiyang and Qiu, Zeju and Feng, Yao and Xiu, Yuliang and Xue, Yuxuan and Yu, Longhui and Feng, Haiwen and Liu, Zhen and Heo, Juyeon and Peng, Songyou and Wen, Yandong and Black, Michael J. and Weller, Adrian and Sch{\"o}lkopf, Bernhard},
  booktitle = {Proceedings of the Twelfth International Conference on Learning Representations (ICLR)},
  month = may,
  year = {2024},
  url = {https://openreview.net/forum?id=7NzgkEdGyr}
}