CAMeLU

This repository contains the official implementation of CAMeLU, a novel unsupervised meta-learning method that exploits in-context learning to learn from an unlabeled dataset. CAMeLU has been accepted at ICLR 2025!

Preliminaries

Download pre-trained models:

  • resnet50.pth is mandatory and must be placed in the resources directory;
  • best_model.pth is only needed if you want to evaluate performance without having trained the model yourself.

Install the packages in a new virtual environment:

Replace ${CUDA} with your CUDA version (e.g. cpu, cu118, or cu121).
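
The exact install command is not reproduced here; a plausible sketch, assuming PyTorch and torchvision are installed from the official PyTorch wheel index, is:

pip install torch torchvision --index-url https://download.pytorch.org/whl/${CUDA}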

Finally, install all the remaining packages in requirements.txt with:

pip install -r requirements.txt

This code has been tested with Python 3.10.x and 3.11.x.

Run

main.py contains both the training and the test phase, so the same file is run for both training and inference, after editing the configuration file at config/config.json as described below (a minimal run command is sketched after the following list). The next sections explain in detail how to manage the respective configurations. Note that the parameters listed below come from a larger library and are not used here:

  • dataset.crop_size
  • dataset.augment_offline
  • train_test.batch_size (computed according to N/k_shot/k_query)
  • train_test.optimizer (hardcoded)
  • train_test.weigth_decay (hardcoded, whenever required)
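
With the configuration in place, running the file is straightforward (assuming main.py takes no mandatory command-line arguments, which the README does not state explicitly):

python main.py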

As for experiment_name, keep its value set to "disabled" if you do not want to use wandb; set it to any other value to enable wandb logging.

Train

An example training configuration can be found at unit_test/config_train.json; the only parameters that are likely to need changing are the path to the dataset and, possibly, the dataset type. After training completes, the best model is saved under the output directory and is automatically used for the test phase.

Requirements

  • dataset.augment_times == model.context.k_shot (usually either 1 or 5, as an int); the program raises an exception (with instructions on how to fix it) if this is not respected;
  • dataset.augment_online == ["support", "query"] (as a list of strings) to enable augmentations; otherwise the supervised counterpart is run! See the configuration sketch below.
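
For reference, a minimal training-configuration fragment satisfying these two requirements could look as follows; the nesting is assumed from the parameter names, so refer to unit_test/config_train.json for the actual structure:

{
  "dataset": {
    "augment_times": 5,
    "augment_online": ["support", "query"]
  },
  "model": {
    "context": {
      "k_shot": 5
    }
  }
}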

Test

If you already have a pre-trained model and only want to perform inference, an example configuration can be found at unit_test/config_test.json.

Requirements

  • dataset.augment_times == null
  • dataset.augment_online == null
  • train_test.model_test_path == "/path/to/model/file.pth" (see the sketch below)
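
For reference, a test-configuration fragment matching these requirements might look as follows; again, the nesting is assumed from the parameter names, so check unit_test/config_test.json for the actual structure:

{
  "dataset": {
    "augment_times": null,
    "augment_online": null
  },
  "train_test": {
    "model_test_path": "/path/to/model/file.pth"
  }
}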

Additional info

This code has been tested with miniimagenet, CIFAR-FS, CUB, Aircraft, and Meta-iNat, and trained on imagenet-964 (named episodic_imagenet). More information on how to recreate these datasets can be found in the respective Python files under lib/libdataset/src/datasets/fsl/<dataset_name>.py and in lib/libdataset/src/datasets/dataset_utils.py.

Citation

@inproceedings{camelu,
  title={Unsupervised Meta-Learning via In-Context Learning},
  author={Vettoruzzo, Anna and Braccaioli, Lorenzo and Vanschoren, Joaquin and Nowaczyk, Marlena},
  booktitle={International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=Jprs1v2wPA}
}
