Add the implementation of FedSP (#709)
xieyxclack authored Oct 20, 2023
1 parent 06e68f5 commit f8ac88c
Showing 49 changed files with 3,339 additions and 52 deletions.
2 changes: 0 additions & 2 deletions .github/workflows/test_atc.yml
@@ -4,7 +4,6 @@ on:
   schedule:
     - cron: '0 8 * * 0'
 
-
 jobs:
   run:
     if: (false == contains(github.event.pull_request.title, 'WIP') && github.repository == 'alibaba/FederatedScope')
@@ -98,4 +97,3 @@ jobs:
           data.root test_data/ \
           [ $? -eq 1 ] && exit 1 || echo "Passed"
74 changes: 74 additions & 0 deletions .github/workflows/test_fedsp.yml
@@ -0,0 +1,74 @@
name: UnitTests for FedSP

on:
  schedule:
    - cron: '0 8 * * 0'

jobs:
  run:
    if: (false == contains(github.event.pull_request.title, 'WIP') && github.repository == 'alibaba/FederatedScope')
    runs-on: ${{ matrix.os }}
    timeout-minutes: 30
    strategy:
      matrix:
        os: [ubuntu-latest]
        python-version: ['3.9']
        torch-version: ['1.10.1']
        torchvision-version: ['0.11.2']
        torchaudio-version: ['0.10.1']
    env:
      OS: ${{ matrix.os }}
      PYTHON: '3.9'
    steps:
      - uses: actions/checkout@master
      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@master
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install PyTorch ${{ matrix.torch-version }}+cpu
        run: |
          pip install numpy typing-extensions dataclasses
          pip install torch==${{ matrix.torch-version}}+cpu torchvision==${{matrix.torchvision-version}}+cpu torchaudio==${{matrix.torchaudio-version}}+cpu -f https://download.pytorch.org/whl/torch_stable.html
      - name: Install FS
        run: |
          pip install -e .[test]
      - name: Install Transformers
        run: |
          pip install transformers==4.21.0
      - name: Install Datasets
        run: |
          pip install datasets
      - name: Install lm-eval
        run: |
          pip install lm-eval
      - name: Test Prompt Tuning
        run: |
          python ../../main.py \
            --cfg federatedscope/nlp/fedsp/baseline/config_alter_train.yaml \
            data.dataset_name arc_challenge \
            data.batch_size 1 \
            data.max_seq_len 32 \
            grad.grad_accum_count 1 \
            federate.client_num 2 \
            federate.total_round_num 2 \
            federate.make_global_train True \
            federate.pl_init_kd True \
            federate.pl_kd_cfg_file federatedscope/nlp/fedsp/baseline/config_init_kd_test.yaml \
            federate.pl_global_cfg_file federatedscope/nlp/fedsp/baseline/config_global.2.yaml \
            model.use_fp16 True \
            model.model_type facebook/opt-1.3b \
            model.use_prefix_prj False \
            model.server_prefix_len 4 \
            model.client_prefix_len 4 \
            model.num_server_layers 24 \
            model.num_client_layers 24 \
            model.share_client_layer_param True \
            model.client_start_layer_id 0 \
            model.num_client_layers_per_cell 1 \
            train.optimizer.lr 5e-4 \
            train.optimizer.eps 1e-4 \
            train.local_update_steps 2 \
            outdir exp/arc_challenge \
            data.is_debug True \
          [ $? -eq 1 ] && exit 1 || echo "Passed"
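Note: the smoke test above can be reproduced outside CI. A minimal sketch, assuming the repository root as the working directory (the workflow itself reaches main.py via a relative ../../main.py path) and keeping only a subset of the flags:

import subprocess

# Abridged from the workflow above; add the remaining model.* / federate.*
# overrides as needed for a faithful reproduction.
cmd = [
    "python", "federatedscope/main.py",
    "--cfg", "federatedscope/nlp/fedsp/baseline/config_alter_train.yaml",
    "data.dataset_name", "arc_challenge",
    "data.batch_size", "1",
    "federate.client_num", "2",
    "federate.total_round_num", "2",
    "data.is_debug", "True",
]
result = subprocess.run(cmd)
print("Passed" if result.returncode == 0 else "Failed")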
43 changes: 43 additions & 0 deletions LICENSE
@@ -661,3 +661,46 @@ distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

--------------------------------------------------------------------------------

The code in federatedscope/nlp/fedsp/dataset and federatedscope/nlp/fedsp/metric
is partially adapted from https://github.com/mit-han-lab/offsite-tuning (MIT License)

Copyright (c) 2023 MIT HAN Lab

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

---------------------------------------------------------------------------------
The code in federatedscope/nlp/fedsp/model and federatedscope/nlp/fedsp/worker/server.py
is partially adapted from https://github.com/THUDM/P-tuning-v2 (Apache License)

Copyright 2021 Xiao Liu

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
2 changes: 1 addition & 1 deletion federatedscope/core/auxiliaries/data_builder.py
@@ -31,7 +31,7 @@
         'dblp_org', 'csbm.*?', 'fb15k-237', 'wn18', 'adult', 'abalone',
         'credit', 'blog'
     ],  # Dummy for FL dataset
-    'RawDataTranslator': ['hetero_nlp_tasks'],
+    'RawDataTranslator': ['hetero_nlp_tasks', 'fedsp_data'],
 }
 DATA_TRANS_MAP = RegexInverseMap(TRANS_DATA_MAP, None)
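For context: TRANS_DATA_MAP maps each translator name to a list of dataset-name regexes, and RegexInverseMap inverts that mapping so a concrete dataset name resolves to its translator — after this change, 'fedsp_data' resolves to 'RawDataTranslator'. A minimal sketch of the idea (illustrative only, not the library's actual implementation):

import re

class InverseRegexLookup:
    """Map a concrete name to the first group whose pattern matches it."""
    def __init__(self, group_to_patterns, default=None):
        self._rules = [(p, group)
                       for group, pats in group_to_patterns.items()
                       for p in pats]
        self._default = default

    def __getitem__(self, name):
        for pattern, group in self._rules:
            if re.fullmatch(pattern, name):
                return group
        return self._default

lookup = InverseRegexLookup({'RawDataTranslator': ['hetero_nlp_tasks', 'fedsp_data']})
assert lookup['fedsp_data'] == 'RawDataTranslator'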
2 changes: 1 addition & 1 deletion federatedscope/core/auxiliaries/logging.py
@@ -100,7 +100,7 @@ def update_logger(cfg, clear_before_add=False):
                 "sub_exp" + datetime.now().strftime('_%Y%m%d%H%M%S'))
         cfg.outdir = outdir
     # if not, make directory with given name
-    os.makedirs(cfg.outdir)
+    os.makedirs(cfg.outdir, exist_ok=True)
 
     # create file handler which logs even debug messages
     fh = logging.FileHandler(os.path.join(cfg.outdir, 'exp_print.log'))
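The exist_ok=True change makes directory creation idempotent, which matters once a run writes into the same outdir more than once (plausibly the case for FedSP's init-KD stage followed by federated training — an inference from the pl_init_kd flow added below). Without it, the second call raises:

import os, tempfile

outdir = os.path.join(tempfile.mkdtemp(), "exp", "arc_challenge")
os.makedirs(outdir)                  # first run: creates the tree
os.makedirs(outdir, exist_ok=True)   # later runs: no-op instead of FileExistsError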
2 changes: 1 addition & 1 deletion federatedscope/core/auxiliaries/metric_builder.py
@@ -1,11 +1,11 @@
 import logging
 import federatedscope.register as register
-from federatedscope.nlp.hetero_tasks.metric import *
 
 logger = logging.getLogger(__name__)
 
 try:
     from federatedscope.contrib.metrics import *
+    from federatedscope.nlp.hetero_tasks.metric import *
 except ImportError as error:
     logger.warning(
         f'{error} in `federatedscope.contrib.metrics`, some modules are not '
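Moving the star-import inside the try block means a missing optional NLP dependency now degrades to a logged warning instead of breaking every import of metric_builder. The pattern in isolation (the module name here is a hypothetical stand-in):

import logging

logger = logging.getLogger(__name__)

try:
    from optional_metrics_extra import *  # hypothetical optional dependency
except ImportError as error:
    logger.warning(f"{error}: optional metrics unavailable, continuing without them.")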
5 changes: 4 additions & 1 deletion federatedscope/core/auxiliaries/model_builder.py
@@ -93,7 +93,7 @@ def get_shape_from_data(data, model_config, backend='torch'):
     return shape
 
 
-def get_model(model_config, local_data=None, backend='torch'):
+def get_model(model_config, local_data=None, backend='torch', role='client'):
     """
     This function builds an instance of model to be trained.
@@ -197,6 +197,9 @@ def get_model(model_config, local_data=None, backend='torch'):
     elif model_config.type.lower() in ['atc_model']:
         from federatedscope.nlp.hetero_tasks.model import ATCModel
         model = ATCModel(model_config)
+    elif model_config.type.lower() in ['fedsp_model']:
+        from federatedscope.nlp.fedsp.model import FedSPModel
+        model = FedSPModel(model_config, role=role)
     else:
         raise ValueError('Model {} is not provided'.format(model_config.type))
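The new role keyword lets one model type serve both endpoints, which FedSP's split setup needs (the server- and client-side layer counts are configured separately in the workflow above). A usage sketch — cfg and local_data are assumed to be prepared as usual, with model.type set to fedsp_model, and role='server' is an illustrative value:

from federatedscope.core.auxiliaries.model_builder import get_model

# Server and client instantiate role-specific views of the FedSP model.
server_model = get_model(cfg.model, local_data, backend='torch', role='server')
client_model = get_model(cfg.model, local_data, backend='torch', role='client')  # the default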
57 changes: 30 additions & 27 deletions federatedscope/core/auxiliaries/trainer_builder.py
@@ -10,8 +10,8 @@
     from federatedscope.contrib.trainer import *
 except ImportError as error:
     logger.warning(
-        f'{error} in `federatedscope.contrib.trainer`, some modules are not '
-        f'available.')
+        f"{error} in `federatedscope.contrib.trainer`, some modules are not "
+        f"available.")
 
 TRAINER_CLASS_DICT = {
     "cvtrainer": "CVTrainer",
@@ -29,6 +29,7 @@
     "cltrainer": "CLTrainer",
     "lptrainer": "LPTrainer",
     "atc_trainer": "ATCTrainer",
+    "fedsp_trainer": "FedSPTrainer",
 }
 
 
@@ -108,16 +109,16 @@ def get_trainer(model=None,
         ``attack.auxiliary.attack_trainer_builder.wrap_attacker_trainer``
     ================================== ===========================
     """
-    if config.trainer.type == 'general':
-        if config.backend == 'torch':
+    if config.trainer.type == "general":
+        if config.backend == "torch":
             from federatedscope.core.trainers import GeneralTorchTrainer
             trainer = GeneralTorchTrainer(model=model,
                                           data=data,
                                           device=device,
                                           config=config,
                                           only_for_eval=only_for_eval,
                                           monitor=monitor)
-        elif config.backend == 'tensorflow':
+        elif config.backend == "tensorflow":
             from federatedscope.core.trainers import GeneralTFTrainer
             trainer = GeneralTFTrainer(model=model,
                                        data=data,
@@ -127,36 +128,38 @@ def get_trainer(model=None,
                                        monitor=monitor)
         else:
             raise ValueError
-    elif config.trainer.type == 'none':
+    elif config.trainer.type == "none":
         return None
     elif config.trainer.type.lower() in TRAINER_CLASS_DICT:
-        if config.trainer.type.lower() in ['cvtrainer']:
+        if config.trainer.type.lower() in ["cvtrainer"]:
             dict_path = "federatedscope.cv.trainer.trainer"
-        elif config.trainer.type.lower() in ['nlptrainer']:
+        elif config.trainer.type.lower() in ["nlptrainer"]:
             dict_path = "federatedscope.nlp.trainer.trainer"
-        elif config.trainer.type.lower() in ['cltrainer', 'lptrainer']:
+        elif config.trainer.type.lower() in ["cltrainer", "lptrainer"]:
             dict_path = "federatedscope.cl.trainer.trainer"
         elif config.trainer.type.lower() in [
-                'graphminibatch_trainer',
+                "graphminibatch_trainer",
         ]:
             dict_path = "federatedscope.gfl.trainer.graphtrainer"
         elif config.trainer.type.lower() in [
-                'linkfullbatch_trainer', 'linkminibatch_trainer'
+                "linkfullbatch_trainer", "linkminibatch_trainer"
         ]:
             dict_path = "federatedscope.gfl.trainer.linktrainer"
         elif config.trainer.type.lower() in [
-                'nodefullbatch_trainer', 'nodeminibatch_trainer'
+                "nodefullbatch_trainer", "nodeminibatch_trainer"
         ]:
             dict_path = "federatedscope.gfl.trainer.nodetrainer"
         elif config.trainer.type.lower() in [
-                'flitplustrainer', 'flittrainer', 'fedvattrainer',
-                'fedfocaltrainer'
+                "flitplustrainer", "flittrainer", "fedvattrainer",
+                "fedfocaltrainer"
         ]:
             dict_path = "federatedscope.gfl.flitplus.trainer"
-        elif config.trainer.type.lower() in ['mftrainer']:
+        elif config.trainer.type.lower() in ["mftrainer"]:
             dict_path = "federatedscope.mf.trainer.trainer"
         elif config.trainer.type.lower() in ['atc_trainer']:
             dict_path = "federatedscope.nlp.hetero_tasks.trainer"
+        elif config.trainer.type.lower() in ["fedsp_trainer"]:
+            dict_path = "federatedscope.nlp.prompt_tuning.trainer"
         else:
             raise ValueError
 
@@ -189,15 +192,15 @@ def get_trainer(model=None,
                               only_for_eval=only_for_eval,
                               monitor=monitor)
     if trainer is None:
-        raise ValueError('Trainer {} is not provided'.format(
+        raise ValueError("Trainer {} is not provided".format(
             config.trainer.type))
 
     if not isinstance(trainer, Trainer):
-        logger.warning(f'Hook-like plug-in functions cannot be enabled when '
-                       f'using {trainer}. If you want use our wrapper '
-                       f'functions for your trainer please consider '
-                       f'inheriting from '
-                       f'`federatedscope.core.trainers.Trainer` instead.')
+        logger.warning(f"Hook-like plug-in functions cannot be enabled when "
+                       f"using {trainer}. If you want use our wrapper "
+                       f"functions for your trainer please consider "
+                       f"inheriting from "
+                       f"`federatedscope.core.trainers.Trainer` instead.")
         return trainer
 
     # differential privacy plug-in
@@ -228,22 +231,22 @@ def get_trainer(model=None,
         trainer = wrap_FedRepTrainer(trainer)
 
     # attacker plug-in
-    if 'backdoor' in config.attack.attack_method:
+    if "backdoor" in config.attack.attack_method:
         from federatedscope.attack.trainer import wrap_benignTrainer
         trainer = wrap_benignTrainer(trainer)
 
     if is_attacker:
-        if 'backdoor' in config.attack.attack_method:
-            logger.info('--------This client is a backdoor attacker --------')
+        if "backdoor" in config.attack.attack_method:
+            logger.info("--------This client is a backdoor attacker --------")
         else:
-            logger.info('-------- This client is an privacy attacker --------')
+            logger.info("-------- This client is an privacy attacker --------")
         from federatedscope.attack.auxiliary.attack_trainer_builder \
             import wrap_attacker_trainer
         trainer = wrap_attacker_trainer(trainer, config)
 
-    elif 'backdoor' in config.attack.attack_method:
+    elif "backdoor" in config.attack.attack_method:
         logger.info(
-            '----- This client is a benign client for backdoor attacks -----')
+            "----- This client is a benign client for backdoor attacks -----")
 
     # fed algorithm plug-in
     if config.fedprox.use:
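For orientation, a registered type such as fedsp_trainer is resolved in two steps: TRAINER_CLASS_DICT supplies the class name and the matching dict_path supplies the module. Schematically (a sketch of the dispatch, not the exact library code):

import importlib

trainer_type = config.trainer.type.lower()              # e.g. 'fedsp_trainer'
dict_path = "federatedscope.nlp.prompt_tuning.trainer"  # per the branch above
trainer_cls = getattr(importlib.import_module(dict_path),
                      TRAINER_CLASS_DICT[trainer_type])  # -> FedSPTrainer
trainer = trainer_cls(model=model, data=data, device=device, config=config,
                      only_for_eval=only_for_eval, monitor=monitor)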
12 changes: 12 additions & 0 deletions federatedscope/core/auxiliaries/worker_builder.py
@@ -72,6 +72,10 @@ def get_client_cls(cfg):
         from federatedscope.nlp.hetero_tasks.worker import ATCClient
         return ATCClient
 
+    if cfg.trainer.type.lower() == 'fedsp_trainer':
+        from federatedscope.nlp.fedsp.worker import FedSPClient
+        return FedSPClient
+
     if cfg.federate.method.lower() in constants.CLIENTS_TYPE:
         client_type = constants.CLIENTS_TYPE[cfg.federate.method.lower()]
     else:
@@ -187,6 +191,14 @@ def get_server_cls(cfg):
         from federatedscope.nlp.hetero_tasks.worker import ATCServer
         return ATCServer
 
+    if cfg.data.type.lower() == 'hetero_nlp_tasks':
+        from federatedscope.nlp.hetero_tasks.worker import ATCServer
+        return ATCServer
+
+    if cfg.trainer.type.lower() == 'fedsp_trainer':
+        from federatedscope.nlp.fedsp.worker import FedSPServer
+        return FedSPServer
+
     if cfg.federate.method.lower() in constants.SERVER_TYPE:
         server_type = constants.SERVER_TYPE[cfg.federate.method.lower()]
     else:
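The net effect: setting trainer.type to fedsp_trainer routes both endpoints to the FedSP workers without touching federate.method. A sketch (cfg is assumed to be a loaded FederatedScope config):

from federatedscope.core.auxiliaries.worker_builder import get_client_cls, get_server_cls

cfg.trainer.type = 'fedsp_trainer'
client_cls = get_client_cls(cfg)  # -> FedSPClient
server_cls = get_server_cls(cfg)  # -> FedSPServer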
5 changes: 5 additions & 0 deletions federatedscope/core/configs/cfg_data.py
@@ -99,6 +99,11 @@ def extend_data_cfg(cfg):
 
     cfg.feat_engr.secure.dp = CN()  # under dev
 
+    # prompt tuning
+    cfg.data.dataset_name = ''  # TODO
+    cfg.data.train_frac = 0.9
+    cfg.data.num_train_per_client = -1
+
     # --------------- outdated configs ---------------
     # TODO: delete this code block
     cfg.data.loader = ''
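These defaults correspond to the data.* overrides in the FedSP workflow above; programmatically they would be set like this (a sketch assuming the usual global_cfg entry point; field semantics inferred from names and defaults):

from federatedscope.core.configs.config import global_cfg

cfg = global_cfg.clone()
cfg.merge_from_list([
    'data.dataset_name', 'arc_challenge',  # task name, as in the CI test
    'data.train_frac', '0.9',              # presumably the train/validation split
    'data.num_train_per_client', '-1',     # presumably -1 means "use all samples"
])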
4 changes: 4 additions & 0 deletions federatedscope/core/configs/cfg_fl_algo.py
@@ -70,6 +70,10 @@ def extend_fl_algo_cfg(cfg):
     cfg.personalization.epoch_linear = 2  # training epoch number
     cfg.personalization.weight_decay = 0.0
 
+    # prompt tuning
+    cfg.personalization.server_local_param = []
+    cfg.personalization.client_local_param = []
+
     # ---------------------------------------------------------------------- #
     # FedSage+ related options, gfl
     # ---------------------------------------------------------------------- #
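server_local_param and client_local_param appear to follow the existing personalization.local_param convention — lists of parameter names kept local rather than aggregated — split per role for the prompt setup; that reading is an inference from the naming. Continuing the cfg sketch above, with hypothetical parameter names for illustration only:

cfg.personalization.server_local_param = ['server_prefix']  # hypothetical name
cfg.personalization.client_local_param = ['client_prefix']  # hypothetical name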
12 changes: 12 additions & 0 deletions federatedscope/core/configs/cfg_fl_setting.py
@@ -23,6 +23,7 @@ def extend_fl_setting_cfg(cfg):
     cfg.federate.data_weighted_aggr = False  # If True, the weight of aggr is
     # the number of training samples in dataset.
     cfg.federate.online_aggr = False
+    cfg.federate.make_global_train = False
     cfg.federate.make_global_eval = False
     cfg.federate.use_diff = False
     cfg.federate.merge_test_data = False  # For efficient simulation, users
@@ -54,6 +55,17 @@ def extend_fl_setting_cfg(cfg):
     cfg.federate.atc_vanilla = False
     cfg.federate.atc_load_from = ''
 
+    # prompt tuning
+    cfg.federate.skip_local_train = False
+    cfg.federate.ckpt_path = ''
+    cfg.federate.pl_save_to = ''
+    cfg.federate.pl_ret_avg_model = False
+    cfg.federate.pl_alter_train = False  # alternately train model and prompt
+    # in each client
+    cfg.federate.pl_init_kd = False
+    cfg.federate.pl_kd_cfg_file = ''
+    cfg.federate.pl_global_cfg_file = ''
+
     # ---------------------------------------------------------------------- #
     # Distribute training related options
     # ---------------------------------------------------------------------- #
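Together these flags drive the FedSP pipeline exercised by the new workflow. Reading the names: pl_init_kd with pl_kd_cfg_file presumably runs a knowledge-distillation stage to initialize prompts, pl_global_cfg_file points at the global-side config, and pl_alter_train alternates model and prompt training in each client (per the inline comment). A sketch mirroring the CI overrides above (same global_cfg assumption as before):

from federatedscope.core.configs.config import global_cfg

cfg = global_cfg.clone()
cfg.merge_from_list([
    'federate.make_global_train', 'True',
    'federate.pl_init_kd', 'True',
    'federate.pl_kd_cfg_file',
    'federatedscope/nlp/fedsp/baseline/config_init_kd_test.yaml',
    'federate.pl_global_cfg_file',
    'federatedscope/nlp/fedsp/baseline/config_global.2.yaml',
])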