From 9b8bbb760224aed4e468363a3fc9ce43a388b6ee Mon Sep 17 00:00:00 2001
From: Artyom Gadetsky
Date: Thu, 13 Jun 2024 23:03:05 +0200
Subject: [PATCH] Update README.md

---
 README.md | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/README.md b/README.md
index d067d0f..2284817 100644
--- a/README.md
+++ b/README.md
@@ -15,6 +15,23 @@ This repo contains the source code of 🐢 TURTLE, an unupervised learning algor
 [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-food-101)](https://paperswithcode.com/sota/image-clustering-on-food-101?p=let-go-of-your-labels-with-unsupervised-1)
 [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-mnist)](https://paperswithcode.com/sota/image-clustering-on-mnist?p=let-go-of-your-labels-with-unsupervised-1)
 [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-stl-10)](https://paperswithcode.com/sota/image-clustering-on-stl-10?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-birdsnap)](https://paperswithcode.com/sota/image-clustering-on-birdsnap?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-sun397)](https://paperswithcode.com/sota/image-clustering-on-sun397?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-stanford-cars)](https://paperswithcode.com/sota/image-clustering-on-stanford-cars?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-fgvc-aircraft)](https://paperswithcode.com/sota/image-clustering-on-fgvc-aircraft?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-oxford-iiit-pets)](https://paperswithcode.com/sota/image-clustering-on-oxford-iiit-pets?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-fer2013)](https://paperswithcode.com/sota/image-clustering-on-fer2013?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-eurosat)](https://paperswithcode.com/sota/image-clustering-on-eurosat?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-resisc45)](https://paperswithcode.com/sota/image-clustering-on-resisc45?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-gtsrb)](https://paperswithcode.com/sota/image-clustering-on-gtsrb?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-kitti)](https://paperswithcode.com/sota/image-clustering-on-kitti?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-country211)](https://paperswithcode.com/sota/image-clustering-on-country211?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-pcam)](https://paperswithcode.com/sota/image-clustering-on-pcam?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-ucf101)](https://paperswithcode.com/sota/image-clustering-on-ucf101?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-kinetics-700)](https://paperswithcode.com/sota/image-clustering-on-kinetics-700?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-clevr-counts)](https://paperswithcode.com/sota/image-clustering-on-clevr-counts?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-hateful-memes)](https://paperswithcode.com/sota/image-clustering-on-hateful-memes?p=let-go-of-your-labels-with-unsupervised-1)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/let-go-of-your-labels-with-unsupervised-1/image-clustering-on-rendered-sst2)](https://paperswithcode.com/sota/image-clustering-on-rendered-sst2?p=let-go-of-your-labels-with-unsupervised-1)
The question we aim to answer in our work is how to utilize representations from foundation models to solve a new task in a fully unsupervised manner. We introduce the problem setting of unsupervised transfer and highlight the key differences between unsupervised transfer and other types of transfer. Specifically, the types of downstream transfer differ in the amount of available supervision. Given the representation spaces of foundation models, (i) supervised transfer, represented as a linear probe, trains a linear classifier on labeled examples of a downstream dataset; (ii) zero-shot transfer assumes descriptions of the visual categories that appear in a downstream dataset are given, and employs them via a text encoder to solve the task; and (iii) unsupervised transfer assumes the least amount of available supervision, i.e., only the number of categories is given, and aims to uncover the underlying human labeling of the dataset.
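As a rough illustration of how the three settings differ, the sketch below contrasts them on synthetic features standing in for a foundation model's frozen representations. All names and the toy data are hypothetical and not taken from the TURTLE codebase; the zero-shot branch mimics CLIP-style cosine similarity to class-description embeddings, and the unsupervised branch uses plain k-means as a stand-in for an unsupervised method that knows only the number of categories.

```python
# Hedged sketch of the three transfer settings on frozen representations.
# Synthetic, well-separated features play the role of foundation-model
# embeddings; the class centers double as "text embeddings" of category
# descriptions so everything lives in one shared space.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
K, d, n = 3, 16, 300                              # categories, embed dim, samples
centers = rng.normal(size=(K, d))                 # toy class structure
y = rng.integers(0, K, size=n)                    # ground-truth labels
Z = centers[y] + 0.1 * rng.normal(size=(n, d))    # "image" representations

# (i) Supervised transfer: a linear probe trained on labeled examples.
probe = LogisticRegression(max_iter=1000).fit(Z, y)
probe_pred = probe.predict(Z)

# (ii) Zero-shot transfer: embed category descriptions with a text encoder
# (here: reuse the centers) and predict by highest cosine similarity.
text_emb = centers
Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
Tn = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
zs_pred = np.argmax(Zn @ Tn.T, axis=1)

# (iii) Unsupervised transfer: only K is known; group the representations.
km_pred = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(Z)
```

Note that cluster indices from the unsupervised branch are only defined up to a permutation of the K categories, so comparing them against ground-truth labels requires an assignment step (e.g., Hungarian matching), whereas the probe and zero-shot predictions are directly comparable.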