This repository contains the official implementation of CAMeLU, a novel unsupervised meta-learning method that exploits in-context learning to learn from an unlabeled dataset. CAMeLU has been accepted at ICLR 2025!
Download the pre-trained models:
- resnet50.pth is mandatory and must be placed in the resources directory;
- best_model.pth is needed only if you want to test performance and have not trained the model yet.
Install the packages in a new virtual environment:
- wheel
- pytorch (version 2.1) with CUDA
- torch-scatter:

pip install torch-scatter -f https://data.pyg.org/whl/torch-2.1.0+${CUDA}.html

Replace ${CUDA} with your CUDA version (e.g. cpu, cu118, or cu121).

Finally, install all the remaining packages in requirements.txt with:

pip install -r requirements.txt

This code has been tested with Python 3.10.x and Python 3.11.x.
The main.py file contains both the training and the test phase, so run that file for both training and inference after changing the configuration file at config/config.json as shown later. The following sections explain in detail how to manage the respective configurations. Please note that the parameters listed below come from a larger library and are not used here:
- dataset.crop_size
- dataset.augment_offline
- train_test.batch_size (computed according to N/k_shot/k_query)
- train_test.optimizer (hardcoded)
- train_test.weigth_decay (hardcoded, whenever required)
As far as experiment_name is concerned, keep its value set to "disabled" if you do not want to use wandb; set it to any other value to enable wandb logging.
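For example, a minimal sketch of the relevant fragment of config/config.json (the other keys are omitted here); replacing "disabled" with any run name of your choice, e.g. "camelu_run", would turn wandb logging on:

```json
{
  "experiment_name": "disabled"
}
```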
An example of a training configuration can be found at unit_test/config_train.json; the only parameters that are likely to change are the path to the dataset and, if needed, the dataset type. After training has completed, the best model is saved under the output directory and is automatically evaluated in the test phase.
Requirements
- dataset.augment_times == model.context.k_shot (usually either 1 or 5, as int); the program raises an exception (with instructions on how to fix it) if this is not respected;
- dataset.augment_online == ["support", "query"] (as a list of strings) to enable augmentations; otherwise the supervised counterpart is run!
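Putting these requirements together, here is a minimal sketch of the training-related fragment of the configuration. The nesting and the dataset keys ("dataset_name", "dataset_path") are assumptions for illustration only, as are the placeholder values; refer to unit_test/config_train.json for the actual structure and key names:

```json
{
  "experiment_name": "disabled",
  "dataset": {
    "dataset_name": "episodic_imagenet",
    "dataset_path": "/path/to/dataset",
    "augment_times": 5,
    "augment_online": ["support", "query"]
  },
  "model": {
    "context": {
      "k_shot": 5
    }
  }
}
```

Note that augment_times (5) matches model.context.k_shot (5), as required above.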
If you already have a pre-trained model and you only want to perform inference, an example configuration can be found at unit_test/config_test.json.
Requirements
- dataset.augment_times == null
- dataset.augment_online == null
- train_test.model_test_path == "/path/to/model/file.pth"
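Analogously, a hedged sketch of the test-related fragment (the nesting is an assumption; see unit_test/config_test.json for the actual file):

```json
{
  "dataset": {
    "augment_times": null,
    "augment_online": null
  },
  "train_test": {
    "model_test_path": "/path/to/model/file.pth"
  }
}
```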
This code has been tested with miniimagenet, CIFAR-FS, CUB, Aircraft, and Meta-iNat, and trained on imagenet-964 (named episodic_imagenet). More information on how to recreate these datasets can be found in the respective Python file under lib/libdataset/src/datasets/fsl/<dataset_name>.py and in lib/libdataset/src/datasets/dataset_utils.py.
@inproceedings{camelu,
title={Unsupervised Meta-Learning via In-Context Learning},
author={Vettoruzzo, Anna and Braccaioli, Lorenzo and Vanschoren, Joaquin and Nowaczyk, Marlena},
booktitle={International Conference on Learning Representations},
year={2025},
url={https://openreview.net/forum?id=Jprs1v2wPA}
}