tf-metric-learning

TensorFlow 2.2 · Python 3.6

Overview

Minimalistic open-source library for metric learning, written in TensorFlow 2 with TensorFlow Addons, NumPy, OpenCV (cv2), and Annoy. This repository contains a TensorFlow2+/tf.keras implementation of some of the loss functions and miners. It was inspired by pytorch-metric-learning.

Installation

Prerequisites:

pip install tensorflow
pip install tensorflow-addons
pip install annoy
pip install opencv-contrib-python

This library:

pip install tf-metric-learning

Features

  • All the loss functions are implemented as tf.keras.layers.Layer
  • Callbacks for computing Recall@K and for visualizing embeddings in the TensorBoard Projector
  • Simple mining mechanism with Annoy
  • Combine multiple loss functions/layers in one model (see the combined-loss sketch under Examples below)

Open-source repos

This library contains code that has been adapted and modified from the following great open-source repos; without them this would not have been possible (THANK YOU):

TODO

  • Discriminative layer optimizer (different learning rates) for losses with weights (Proxy, SoftTriple, ...)
  • Some tests 😇
  • Improve and add more miners

Examples

import tensorflow as tf
import numpy as np

from tf_metric_learning.layers import SoftTripleLoss
from tf_metric_learning.utils.constants import EMBEDDINGS, LABELS

num_class, num_centers, embedding_size = 10, 2, 256

inputs = tf.keras.Input(shape=(embedding_size,), name=EMBEDDINGS)
input_label = tf.keras.layers.Input(shape=(1,), name=LABELS)
output_tensor = SoftTripleLoss(num_class, num_centers, embedding_size)({EMBEDDINGS:inputs, LABELS:input_label})

model = tf.keras.Model(inputs=[inputs, input_label], outputs=output_tensor)
model.compile(optimizer="adam")

data = {EMBEDDINGS: np.zeros((1000, embedding_size), dtype=np.float32), LABELS: np.zeros(1000, dtype=np.float32)}
model.fit(data, None, epochs=10, batch_size=10)
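Loss layers can also be combined in one model. The sketch below attaches two SoftTripleLoss heads (with a different number of centers) to the same embeddings; it assumes each layer registers its own loss term internally, as the example above implies (the model compiles without an explicit loss). This is only an illustration, not a pattern taken from the bundled examples:

import tensorflow as tf

from tf_metric_learning.layers import SoftTripleLoss
from tf_metric_learning.utils.constants import EMBEDDINGS, LABELS

num_class, embedding_size = 10, 256

embeddings = tf.keras.Input(shape=(embedding_size,), name=EMBEDDINGS)
labels = tf.keras.layers.Input(shape=(1,), name=LABELS)

# Two loss heads over the same embeddings; each layer adds its own
# loss term, so the compiled model optimizes their sum.
out_a = SoftTripleLoss(num_class, 2, embedding_size)({EMBEDDINGS: embeddings, LABELS: labels})
out_b = SoftTripleLoss(num_class, 4, embedding_size)({EMBEDDINGS: embeddings, LABELS: labels})

model = tf.keras.Model(inputs=[embeddings, labels], outputs=[out_a, out_b])
model.compile(optimizer="adam")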

More complex scenarios:

Features

Loss functions

Miners

  • MaximumLossMiner [TODO]
  • TripletAnnoyMiner ✅
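TripletAnnoyMiner builds on Spotify's Annoy index. As a rough illustration of the underlying idea (using the annoy package directly, not the TripletAnnoyMiner API — all names below are just for the sketch), the nearest neighbors of an anchor embedding can serve as hard candidates when forming triplets:

import numpy as np
from annoy import AnnoyIndex

embedding_size = 256
embeddings = np.random.rand(1000, embedding_size).astype(np.float32)

# Build an approximate nearest-neighbor index over the embeddings.
index = AnnoyIndex(embedding_size, "euclidean")
for i, emb in enumerate(embeddings):
    index.add_item(i, emb)
index.build(10)  # 10 trees; more trees -> better accuracy, slower build

# For an anchor embedding, retrieve its closest neighbors as candidates
# for hard positives/negatives when assembling triplets.
neighbors = index.get_nns_by_vector(embeddings[0], 10)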

Evaluators

  • AnnoyEvaluatorCallback: for evaluating Recall@K; you will need to install Spotify's Annoy library.

import tensorflow as tf
from tf_metric_learning.utils.recall import AnnoyEvaluatorCallback

# base_network maps images to embeddings; divide splits the test data
# into images stored in the index and images used as queries.
evaluator = AnnoyEvaluatorCallback(
    base_network,
    {"images": test_images[:divide], "labels": test_labels[:divide]},  # images stored in the index
    {"images": test_images[divide:], "labels": test_labels[divide:]},  # images used as queries
    normalize_fn=lambda images: images / 255.0,
    normalize_eb=True,
    eb_size=embedding_size,
    freq=1,
)
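Since the evaluator is a standard tf.keras callback, it can be passed straight to model.fit; a minimal usage sketch, reusing the model and data from the example above:

model.fit(data, None, epochs=10, batch_size=10, callbacks=[evaluator])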

Visualizations

  • TensorBoard Projector Callback

import numpy as np
import tensorflow as tf
from tf_metric_learning.utils.projector import TBProjectorCallback

def normalize_images(images):
    return images/255.0

(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.cifar10.load_data()
...

# base_model maps images to embeddings
projector = TBProjectorCallback(
    base_model,
    "tb/projector",
    test_images, # list of images
    np.squeeze(test_labels),
    normalize_eb=True,
    normalize_fn=normalize_images
)
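The projector is likewise a tf.keras callback; a minimal usage sketch, assuming the same model and data as in the examples above. After training, point TensorBoard at the log directory (tensorboard --logdir tb) to browse the embeddings:

model.fit(data, None, epochs=10, batch_size=10, callbacks=[projector])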
