
FedSA-LoRA

This is the official PyTorch implementation of Federated Share-A Low-Rank Adaptation (FedSA-LoRA).
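The core idea of FedSA-LoRA is selective aggregation: each client trains both LoRA factors A and B, but only the A matrices are sent to the server and averaged, while the B matrices stay local. A minimal sketch of that aggregation step, assuming illustrative names and plain nested-list matrices (not the actual repository API):

```python
# Hypothetical sketch of FedSA-LoRA's selective aggregation.
# Each client update holds LoRA factors "A" and "B" as nested lists;
# only the A matrices are averaged on the server, B never leaves the client.

def aggregate_A(client_updates):
    """Average the LoRA A matrices across clients; B matrices are untouched."""
    n = len(client_updates)
    first_A = client_updates[0]["A"]
    rows, cols = len(first_A), len(first_A[0])
    # element-wise mean over all clients' A matrices
    return [[sum(cu["A"][i][j] for cu in client_updates) / n
             for j in range(cols)] for i in range(rows)]

clients = [
    {"A": [[1.0, 2.0]], "B": [[0.5], [0.5]]},  # client 0
    {"A": [[3.0, 4.0]], "B": [[1.5], [1.5]]},  # client 1
]
global_A = aggregate_A(clients)  # broadcast back; each client keeps its own B
```

Here `global_A` is `[[2.0, 3.0]]`, the element-wise mean of the two clients' A factors; the B factors remain per-client and are never transmitted.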

Installation

Our code is based on Python 3.10 and PyTorch 2.1.0. You can install all the dependencies with the following commands:

conda create -n fedsa-lora python=3.10
conda activate fedsa-lora
conda install pytorch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 pytorch-cuda=12.1 -c pytorch -c nvidia
pip install -e .[llm]

Training

Now we can fine-tune an LLM with FedSA-LoRA:

python federatedscope/main.py --cfg federatedscope/glue/yamls/fedsa-lora.yaml

Acknowledgement

We would like to thank the authors of FederatedScope-LLM for releasing their public repository.
