evaluation module

The evaluation module provides functions implementing numerical metrics and visualisations, as well as callbacks, for use during training and inference.

Callbacks

class evaluation.TensorboardImage(valid_generator, denoiser, preprocess='clip')[source]

Bases: keras.callbacks.callbacks.Callback

Tensorboard Keras callback. At the end of each epoch, it logs metric/loss information computed on validation data to Tensorboard, as well as a denoising summary.

valid_generator

Image dataset generator. Provides validation data for model evaluation.

Type:data.AbstractDatasetGenerator
denoiser

Image denoising object.

Type:model.AbstractDeepLearningModel
folder_string

String containing the path to the logging directory. Corresponds to the denoiser’s name.

Type:str
__init__(self, valid_generator, denoiser, preprocess='clip')[source]

Initialize self. See help(type(self)) for accurate signature.
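
A minimal usage sketch (assuming valid_generator, train_generator and denoiser objects were created beforehand; the final training call is hypothetical and only illustrates how a Keras callback is typically attached):

>>> from OpenDenoising.evaluation import TensorboardImage
>>> # Log validation metrics/loss and a denoising summary to Tensorboard at each epoch end.
>>> tboard = TensorboardImage(valid_generator=valid_generator, denoiser=denoiser, preprocess='clip')
>>> # Hypothetical training call: the callback is passed through a callbacks list.
>>> denoiser.train(train_generator, valid_generator, callbacks=[tboard])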

class evaluation.LrSchedulerCallback[source]

Bases: keras.callbacks.callbacks.Callback

Custom learning rate scheduler base class built on Keras callbacks. This class mainly exists for compatibility between TfModel, PytorchModel and KerasModel. Please note that it should not be used directly as a learning rate scheduler: to define one, use a class that inherits from LrSchedulerCallback, as in the sketch after the on_epoch_end documentation below.

__init__(self)[source]

Initialize self. See help(type(self)) for accurate signature.

on_epoch_end(self, epoch, logs=None)[source]

Called at the end of an epoch.

Subclasses should override for any actions to run. This function should only be called during train mode.

# Arguments

epoch: integer, index of epoch.
logs: dict, metric results for this training epoch, and for the validation epoch if validation is performed. Validation result keys are prefixed with val_.
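
A sketch of a custom scheduler, using only what is documented here: a subclass inherits from LrSchedulerCallback and overrides on_epoch_end. The class name, the initial_lr attribute and the way the new value is stored are illustrative assumptions; how the base class actually applies the value to the optimizer is backend-dependent and not shown in this documentation.

>>> from OpenDenoising.evaluation import LrSchedulerCallback
>>> class HalvingSchedule(LrSchedulerCallback):
...     """Illustrative scheduler that halves the learning rate every epoch."""
...     def __init__(self, initial_lr=1e-3):
...         super().__init__()
...         self.initial_lr = initial_lr
...         self.lr = initial_lr
...     def on_epoch_end(self, epoch, logs=None):
...         # Compute the new value; applying it to the optimizer is backend-dependent.
...         self.lr = self.initial_lr * (0.5 ** epoch)
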
class evaluation.DnCNNSchedule(initial_lr=0.001)[source]

Bases: evaluation.callbacks.LrSchedulerCallback

DnCNN learning rate decay scheduler as specified in the original paper.

After epoch 30, drops the initial learning rate by a factor of 10. After epoch 60, drops the initial learning rate by a factor of 20.

initial_lr

Initial learning rate value.

Type:float
__init__(self, initial_lr=0.001)[source]

Initialize self. See help(type(self)) for accurate signature.
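
A short usage sketch; the resulting values follow directly from the decay rule above:

>>> from OpenDenoising.evaluation import DnCNNSchedule
>>> schedule = DnCNNSchedule(initial_lr=0.001)
>>> # Learning rate: 1e-3 up to epoch 30, 1e-4 (initial / 10) after epoch 30,
>>> # and 5e-5 (initial / 20) after epoch 60.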

class evaluation.StepSchedule(initial_lr=0.001, factor=0.5, dropEvery=10)[source]

Bases: evaluation.callbacks.LrSchedulerCallback

Drops the learning rate by a factor of ‘factor’ every ‘dropEvery’ epochs.

initial_lr

Initial Learning Rate.

Type:float
factor

Decay factor.

Type:float
dropEvery

The learning rate will be decayed periodically, where the period is defined by dropEvery.

Type:int
__init__(self, initial_lr=0.001, factor=0.5, dropEvery=10)[source]

Initialize self. See help(type(self)) for accurate signature.
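
A plain-Python sketch of the resulting rule with the default arguments (the exact rounding applied internally may differ slightly):

>>> initial_lr, factor, dropEvery = 0.001, 0.5, 10
>>> epoch = 25
>>> initial_lr * factor ** (epoch // dropEvery)  # two drops have occurred by epoch 25
0.00025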

class evaluation.PolynomialSchedule(initial_lr=0.001, maxEpochs=100, power=1.0)[source]

Bases: evaluation.callbacks.LrSchedulerCallback

Drops the learning rate following a polynomial schedule:

\[\alpha = \alpha_{0}\biggl(1 - \dfrac{epoch}{maxEpochs}\biggr)^{power}\]
initial_lr

Initial Learning Rate.

Type:float
maxEpochs

When epoch reaches maxEpochs, the learning rate becomes zero.

Type:int
power

Polynomial power.

Type:int
__init__(self, initial_lr=0.001, maxEpochs=100, power=1.0)[source]

Initialize self. See help(type(self)) for accurate signature.
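
A worked instance of the formula above with the default arguments (power=1.0 gives a linear decay):

>>> initial_lr, maxEpochs, power = 0.001, 100, 1.0
>>> epoch = 50
>>> initial_lr * (1 - epoch / maxEpochs) ** power
0.0005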

class evaluation.ExponentialSchedule(initial_lr=0.001, gamma=0.5)[source]

Bases: evaluation.callbacks.LrSchedulerCallback

Drops the learning rate following an exponential schedule:

\[\alpha = \alpha_{0}\times\gamma^{epoch}\]
initial_lr

Initial Learning Rate.

Type:float
factor

Decay rate (the gamma argument of the constructor) by which the learning rate is decayed at each epoch.

Type:float
__init__(self, initial_lr=0.001, gamma=0.5)[source]

Initialize self. See help(type(self)) for accurate signature.
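
A worked instance of the formula above with the default gamma:

>>> initial_lr, gamma = 0.001, 0.5
>>> epoch = 3
>>> initial_lr * gamma ** epoch
0.000125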

class evaluation.CheckpointCallback(denoiser, monitor='loss', mode='max', period=1)[source]

Bases: keras.callbacks.callbacks.Callback

Creates training checkpoints for Deep Learning models.

denoiser

Denoiser object to be saved.

Type:model.AbstractDenoiser
monitor

Name of the metric being tracked. If the name is not present in logs, tracks the loss value instead.

Type:str
mode

String having one of these two values: {‘max’, ‘min’}. If it is ‘max’, saves the model with the greater metric value. If it is ‘min’, saves the model with the smaller metric value.

Type:str
period

Saves models at uniform intervals specified by period.

Type:int
logdir

String containing the path to the logs directory.

Type:str
__init__(self, denoiser, monitor='loss', mode='max', period=1)[source]

Initialize self. See help(type(self)) for accurate signature.
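
A usage sketch with the documented arguments (the denoiser object is assumed to exist elsewhere, and ‘PSNR’ is only an example of a tracked metric name):

>>> from OpenDenoising.evaluation import CheckpointCallback
>>> # Save the model with the highest tracked metric value, checking every 5 epochs.
>>> checkpoint = CheckpointCallback(denoiser, monitor='PSNR', mode='max', period=5)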

Metrics

class evaluation.Metric(name, tf_metric=None, np_metric=None)[source]

Bases: object

The Metric class provides a common interface between Tensorflow and numpy metrics.

Notes

Using this class is recommended over calling metric functions directly, because functions that work on tensors should not be applied to numpy arrays: each such call creates a new tensor, eventually causing a memory overflow. See Examples for more information. Note that, for inference on the benchmark, you should always specify numpy metrics.

tf_metric

Tensorflow function implementing the metric on tensors.

Type:function
np_metric

Numpy function implementing metric on ndarrays.

Type:function

Examples

The most basic usage of the Metric class is when you have one function for processing tensors and another for numpy arrays. SSIM, PSNR and MSE are provided as built-in metrics. For instance,

>>> import numpy as np
>>> import tensorflow as tf
>>> from OpenDenoising.evaluation import Metric, tf_ssim, skimage_ssim
>>> ssim = Metric(name="SSIM", tf_metric=tf_ssim, np_metric=skimage_ssim)
>>> x = tf.placeholder(tf.float32, [None, None, None, 1])
>>> y = tf.placeholder(tf.float32, [None, None, None, 1])
>>> ssim(x, y)
<tf.Tensor 'Mean_3:0' shape=() dtype=float32>
>>> x_np = np.random.randn(10, 256, 256, 1)
>>> y_np = np.random.randn(10, 256, 256, 1)
>>> ssim(x_np, y_np)
0.007000506155677978

That is, when both metrics are specified, the class dispatches on the input type and returns either a tensor or a numeric value accordingly.

__call__(self, y_true, y_pred)[source]

Call self as a function.

__init__(self, name, tf_metric=None, np_metric=None)[source]

Initialize self. See help(type(self)) for accurate signature.

__str__(self)[source]

Return str(self).
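
Following the note above, a metric intended only for benchmark inference can be declared with just its numpy implementation:

>>> from OpenDenoising.evaluation import Metric, skimage_psnr
>>> psnr = Metric(name="PSNR", np_metric=skimage_psnr)

Since no tf_metric is given, such an object should only be called on numpy arrays.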

Tensorflow Metrics

evaluation.tf_ssim(y_true, y_pred)[source]

Structural Similarity Index.

Parameters:
  • y_true (tf.Tensor) – Tensor corresponding to ground-truth images (clean).
  • y_pred (tf.Tensor) – Tensor corresponding to the Network’s prediction.
Returns:

Tensor corresponding to the evaluated metric.

Return type:

tf.Tensor

evaluation.tf_mse(y_true, y_pred)[source]

Mean Squared Error.

\[MSE = \dfrac{1}{N \times H \times W \times C}\sum_{n=0}^{N}\sum_{i=0}^{H}\sum_{j=0}^{W}\sum_{k=0}^{C}(y_{true} (n, i, j, k)-y_{pred}(n, i, j, k))^{2}\]
Parameters:
  • y_true (tf.Tensor) – Tensor corresponding to ground-truth images (clean).
  • y_pred (tf.Tensor) – Tensor corresponding to the Network’s prediction.
Returns:

Tensor corresponding to the evaluated metric.

Return type:

tf.Tensor

evaluation.tf_psnr(y_true, y_pred)[source]

Peak Signal to Noise Ratio loss.

\[PSNR = \dfrac{10}{N}\sum_{n=0}^{N}\log_{10}\biggl(\dfrac{\max(y_{true}(n)^{2})}{MSE(y_{true}(n), y_{pred}(n))}\biggr)\]
Parameters:
  • y_true (tf.Tensor) – Tensor corresponding to ground-truth images (clean).
  • y_pred (tf.Tensor) – Tensor corresponding to the Network’s prediction.
Returns:

Tensor corresponding to the evaluated metric.

Return type:

tf.Tensor

evaluation.tf_se(y_true, y_pred)[source]

Squared Error loss.

\[SE = \sum_{n=0}^{N}\sum_{i=0}^{H}\sum_{j=0}^{W}\sum_{k=0}^{C}(y_{true}(n, i, j, k)-y_{pred}(n, i, j, k))^{2}\]
Parameters:
  • y_true (tf.Tensor) – Tensor corresponding to ground-truth images (clean).
  • y_pred (tf.Tensor) – Tensor corresponding to the Network’s prediction.
Returns:

Tensor corresponding to the evaluated metric.

Return type:

tf.Tensor
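
The tensor metrics above can be evaluated the same way tf_ssim is used in the Metric example; for instance, with tf_psnr (Tensorflow 1.x graph mode, as elsewhere in these examples):

>>> import tensorflow as tf
>>> from OpenDenoising.evaluation import tf_psnr
>>> y_true = tf.placeholder(tf.float32, [None, None, None, 1])
>>> y_pred = tf.placeholder(tf.float32, [None, None, None, 1])
>>> psnr = tf_psnr(y_true, y_pred)  # a tf.Tensor holding the metric, to be evaluated in a session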

Skimage Metrics

evaluation.skimage_ssim(y_true, y_pred)[source]

Skimage SSIM wrapper.

Parameters:
  • y_true (numpy.ndarray) – 4D numpy array containing ground-truth images.
  • y_pred (numpy.ndarray) – 4D numpy array containing the Network’s prediction.
Returns:

Scalar value of SSIM between y_true and y_pred

Return type:

float

evaluation.skimage_mse(y_true, y_pred)[source]

Skimage MSE wrapper.

Parameters:
  • y_true (numpy.ndarray) – 4D numpy array containing ground-truth images.
  • y_pred (numpy.ndarray) – 4D numpy array containing the Network’s prediction.
Returns:

Scalar value of MSE between y_true and y_pred

Return type:

float

evaluation.skimage_psnr(y_true, y_pred)[source]

Skimage PSNR wrapper.

Parameters:
  • y_true (numpy.ndarray) – 4D numpy array containing ground-truth images.
  • y_pred (numpy.ndarray) – 4D numpy array containing the Network’s prediction.
Returns:

Scalar value of PSNR between y_true and y_pred

Return type:

float
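
A small numpy sketch for the wrappers above (random arrays are used only to show the expected 4D shape, batch x height x width x channels):

>>> import numpy as np
>>> from OpenDenoising.evaluation import skimage_mse, skimage_psnr
>>> clean = np.random.rand(4, 64, 64, 1)
>>> noisy = np.clip(clean + 0.1 * np.random.randn(4, 64, 64, 1), 0, 1)
>>> mse = skimage_mse(clean, noisy)    # float
>>> psnr = skimage_psnr(clean, noisy)  # float, in dB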

Visualisations

class evaluation.Visualisation(func, name)[source]

Bases: object

Wraps visualisation functions.

func

Reference to a function that will create the plot.

Type:function
name

Visualisation’s name.

Type:str
__call__(self, file_dir, **kwargs)[source]
Parameters:
  • file_dir (str) – Path to the directory containing the files that will be used for constructing the visualisation.
  • kwargs (dict) – Keyword arguments. Used for passing optional arguments for the visualisation.

Examples

Assuming you output your benchmark results to “./results”, and that you have a Benchmark named “MyBenchTests”, you can use the boxplot function to generate visualisations from the output .csv files.

>>> from OpenDenoising.evaluation import Visualisation, boxplot
>>> vis = Visualisation(func=boxplot, name="boxplot_PSNR")
>>> vis(file_dir="./results/")

__init__(self, func, name)[source]

Initialize self. See help(type(self)) for accurate signature.

__str__(self)[source]

Return str(self).

Functions

evaluation.boxplot(csv_dir, output_dir, metric='PSNR', show=True)[source]

Wraps Seaborn boxplot function.

Parameters:
  • csv_dir (str) – String containing the path to CSV file directory holding the data.
  • output_dir (str) – String containing the path to save the image.
  • metric (str) – String containing the name of the metric being shown by the plot.
  • show (bool) – If True, shows the plot in addition to saving it to output_dir.
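
The same function can also be called directly, outside of a Visualisation object (the paths below are placeholders):

>>> from OpenDenoising.evaluation import boxplot
>>> # Reads the benchmark .csv files from csv_dir and saves a PSNR boxplot image to output_dir.
>>> boxplot(csv_dir="./results/", output_dir="./results/", metric="PSNR", show=False)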