
Unsupervised Keypoints from Pretrained Diffusion Models (CVPR 2024 Highlight)

Eric Hedlin, Gopal Sharma, Shweta Mahajan, Xingzhe He, Hossam Isack, Abhishek Kar, Helge Rhodin, Andrea Tagliasacchi, Kwang Moo Yi

Project Page

For more detailed information, visit our project page or read our paper.

Interactive Demo

We provide an interactive demo in Google Colab. It lets a user upload custom images, then optimizes the keypoints and visualizes them over those images.

Requirements

Set up environment

Create a conda environment using the provided requirements.yaml:

conda env create -f requirements.yaml
conda activate StableKeypoints
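
Optionally, you can confirm the environment is working (this check is a suggestion, not part of the repository's instructions):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"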

Hugging Face Token

To download the pre-trained models from Hugging Face, you need a read-access token.

  1. Create an account on Hugging Face if you don't have one.
  2. Go to your account settings and create a new token with "read" permissions.
  3. When running the script, provide this token using the --my_token YOUR_TOKEN_HERE command-line argument, as shown below.
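
For example, the token can be passed together with the other arguments (YOUR_TOKEN_HERE is a placeholder for your actual token):

python3 -m unsupervised_keypoints.main --my_token YOUR_TOKEN_HERE [other arguments]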

Download datasets

The CelebA, Taichi, Human3.6M, DeepFashion, and CUB datasets can be found on their websites.

Preprocessed data for CelebA and CUB can be found in Autolink's repository.

Usage

To use the code, run:

python3 -m unsupervised_keypoints.main [arguments]

Main Arguments

  • --dataset_loc: Path to the dataset.
  • --dataset_name: Name of the dataset.
  • --num_steps: Number of optimization steps (default 500, up to 10,000 for non-human datasets).
  • --evaluation_method: Following baselines, the evaluation method varies by dataset:
    • CelebA: 'inter_eye_distance'
    • CUB: 'visible'
    • Taichi: 'mean_average_error' (renormalized per keypoint)
    • DeepFashion: 'pck'
    • Human3.6M: 'orientation_invariant'
  • --save_folder: Output save location (default "outputs" inside the repo).

Example Usage

python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name celeba_wild --evaluation_method inter_eye_distance --save_folder /path/to/save

If you want to use a custom dataset, you can run:

python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name custom
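
As an illustration only (the paths, step count, and token are placeholders), a run that also sets the number of steps, save folder, and Hugging Face token might look like:

python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name custom --num_steps 10000 --save_folder /path/to/save --my_token YOUR_TOKEN_HERE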

Precomputed tokens

We provide the precomputed tokens here.
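
As a minimal sketch (the file name and contents below are assumptions, not part of the release), the downloaded tokens could be inspected with PyTorch:

import torch

# Hypothetical file name; substitute the actual file from the download above.
tokens = torch.load("precomputed_tokens.pt", map_location="cpu")
print(type(tokens))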

BibTeX

@article{hedlin2023keypoints,
  title={Unsupervised Keypoints from Pretrained Diffusion Models},
  author={Hedlin, Eric and Sharma, Gopal and Mahajan, Shweta and He, Xingzhe and Isack, Hossam and Kar, Abhishek and Rhodin, Helge and Tagliasacchi, Andrea and Yi, Kwang Moo},
  journal={arXiv preprint arXiv:2312.00065},
  year={2023}
}
