Project README

This project is based on Python 3.10. To get started, you can create an environment using conda with the following command:

```shell
conda create -n superpos python=3.10
```

After setting up the environment, install all the required packages with:

```shell
pip install -r requirements.txt
```

Project Structure

The entry point of this project is located in the ./09_Cluster directory. The most important files in this directory are the config.yaml files. Below is an example of a configuration file:

```yaml
default: &default
  use_tqdm: true
  random_seed: 42
  base_save_path: /home/msadraei/trained_final
  model_name: google/t5-base-lm-adapt
  project_name_prefix: iclr_attempt_lmt5
  experiment_name_suffix: null
  train_batch_size: 32
  valid_batch_size: 32
  remove_dropout: true
  learning_rate: 0.01
  weight_decay: 0.01
  num_epochs: 40
  peft_params: null  # no mutation
  hot_modules:
  - sadcl
  best_finder:
    save: true
    metric: valid_mean
    higher_better: true
  tasks:
  - glue:cola
  - glue:mrpc
  - glue:stsb
  - superglue:rte
  - superglue:cb
  - superglue:wic
  - superglue:copa
  - superglue:boolq
  - superglue:multirc

pp: &pp
  - /home/msadraei/trained_final/hzi_cluster_t5_base_glue-mnli/10_combine_128
  - /home/msadraei/trained_final/hzi_cluster_t5_base_glue-sst2/10_combine_128
  - /home/msadraei/trained_final/hzi_cluster_t5_base_glue-qqp/10_combine_128
  - /home/msadraei/trained_final/hzi_cluster_t5_base_glue-qnli/10_combine_128

run_configs:
- <<: *default
  learning_rate: 0.3
  weight_decay: 0.00001
  peft_params:
    kind: attempt
    n_tokens: 10
    g_bottleneck: 100
    pretrained_paths: *pp
```
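
The `<<: *default` merge key means each entry under `run_configs` starts from the `default` anchor and overrides only the keys it lists. A minimal Python sketch of how that resolution behaves (values abridged from the config above):

```python
# Sketch of YAML merge-key (`<<: *default`) semantics: the run config
# inherits every default value, and its own keys take precedence.
default = {
    "learning_rate": 0.01,
    "weight_decay": 0.01,
    "num_epochs": 40,
    "peft_params": None,  # no mutation
}

override = {
    "learning_rate": 0.3,
    "weight_decay": 0.00001,
    "peft_params": {"kind": "attempt", "n_tokens": 10},
}

# Later keys win, just as the run config's keys shadow the merged defaults.
run_config = {**default, **override}
print(run_config["learning_rate"])  # 0.3 (overridden)
print(run_config["num_epochs"])     # 40 (inherited from default)
```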

PEFT Support

This project supports several Parameter-Efficient Fine-Tuning (PEFT) methods. The valid values for `peft_params.kind` are 'combine', 'residual', 'simple', 'spot', and 'attempt'. Each run configuration is executed once for every dataset listed under `tasks`.
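
The scheduling described above can be sketched as a cross product of run configurations and tasks. The names below are illustrative, not the project's actual API; only the set of valid PEFT kinds comes from this README:

```python
from itertools import product

# PEFT kinds named in this README; anything else is rejected.
VALID_PEFT_KINDS = {"combine", "residual", "simple", "spot", "attempt"}

# Hypothetical run configs and task list, mirroring the YAML structure.
run_configs = [
    {"learning_rate": 0.3, "peft_params": {"kind": "attempt", "n_tokens": 10}},
    {"learning_rate": 0.01, "peft_params": None},  # null means no PEFT mutation
]
tasks = ["glue:cola", "superglue:rte"]

# Every run configuration is launched once per task.
jobs = []
for cfg, task in product(run_configs, tasks):
    peft = cfg["peft_params"]
    if peft is not None and peft["kind"] not in VALID_PEFT_KINDS:
        raise ValueError(f"unknown PEFT kind: {peft['kind']}")
    jobs.append((cfg, task))

print(len(jobs))  # 4 runs: 2 configs x 2 tasks
```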

Running the Project

To run a configuration, use the following command:

```shell
python train.py config.yaml
```

This will start the training process based on the settings defined in config.yaml.