Fedsim Scores#

class Accuracy(log_freq: int = 1, split='test', score_name='accuracy', reduction: str = 'micro')[source]#

updatable accuracy score

__call__(input, target) torch.Tensor[source]#

updates the accuracy score on a mini-batch, detached from the computational graph, and also returns the current batch score without detaching it from the graph.

Parameters
  • input (Tensor) -- Predicted unnormalized scores (often referred to as logits); see Shape section below for supported shapes.

  • target (Tensor) -- Ground truth class indices or class probabilities; see Shape section below for supported shapes.

Shape:

  • Input: shape \((N, C)\).

  • Target: shape \((N)\), where each value should be in the range \([0, C)\).

where:

\[\begin{split}\begin{aligned} C ={} & \text{number of classes} \\ N ={} & \text{batch size} \\ \end{aligned}\end{split}\]
Returns

Tensor -- accuracy score of current batch

Parameters
  • log_freq (int, optional) -- how many steps gap between two evaluations. Defaults to 1.

  • split (str, optional) -- data split to evaluate on. Defaults to 'test'.

  • score_name (str) -- name of the score object

  • reduction (str) -- Specifies the reduction to apply to the output: 'micro' | 'macro'. 'micro': as if mini-batches are concatenated. 'macro': mean of accuracy of each mini-batch (update). Default: 'micro'
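The difference between the two reductions can be illustrated with a plain-Python sketch. This is an illustrative stand-in, not the fedsim implementation; the class and method names below are hypothetical:

```python
class RunningAccuracy:
    """Illustrative accumulator mirroring the documented reductions."""

    def __init__(self, reduction="micro"):
        self.reduction = reduction
        self.correct = 0        # total correct predictions ('micro')
        self.seen = 0           # total samples seen ('micro')
        self.batch_scores = []  # per-batch accuracies ('macro')

    def update(self, preds, targets):
        correct = sum(p == t for p, t in zip(preds, targets))
        self.correct += correct
        self.seen += len(targets)
        self.batch_scores.append(correct / len(targets))
        return correct / len(targets)  # current batch score

    def get_score(self):
        if self.reduction == "micro":
            # as if all mini-batches were concatenated
            return self.correct / self.seen
        # 'macro': mean of the per-batch accuracies
        return sum(self.batch_scores) / len(self.batch_scores)


micro, macro = RunningAccuracy("micro"), RunningAccuracy("macro")
for preds, targets in [([1, 0], [1, 1]), ([2, 2, 2, 2], [2, 2, 2, 0])]:
    micro.update(preds, targets)
    macro.update(preds, targets)

print(micro.get_score())  # 4/6: pooled over all 6 samples
print(macro.get_score())  # (0.5 + 0.75) / 2 = 0.625
```

With equal batch sizes the two reductions coincide; they differ, as above, when batch sizes are unequal.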

get_score() float[source]#

returns the score

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

float -- the score

is_differentiable() bool[source]#

checks whether the score is differentiable (e.g. so that it can be used as a loss function).

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

bool -- True if the output of the call is differentiable.

reset() None[source]#

resets the internal buffers, making the score ready to start collecting again

Raises

NotImplementedError -- This abstract method should be implemented by child classes

class CrossEntropyScore(log_freq: int = 1, split='test', score_name='cross_entropy_score', weight=None, reduction: str = 'micro', label_smoothing: float = 0.0)[source]#

updatable cross entropy score

__call__(input, target) torch.Tensor[source]#

updates the cross entropy score on a mini-batch, detached from the computational graph, and also returns the current batch score without detaching it from the graph.

Parameters
  • input (Tensor) -- Predicted unnormalized scores (often referred to as logits); see Shape section below for supported shapes.

  • target (Tensor) -- Ground truth class indices or class probabilities; see Shape section below for supported shapes.

Shape:

  • Input: shape \((C)\) or \((N, C)\).

  • Target: shape \(()\) or \((N)\), where each value should be in the range \([0, C)\).

where:

\[\begin{split}\begin{aligned} C ={} & \text{number of classes} \\ N ={} & \text{batch size} \\ \end{aligned}\end{split}\]
Returns

Tensor -- cross entropy score of current batch

Parameters
  • log_freq (int, optional) -- how many steps gap between two evaluations. Defaults to 1.

  • split (str, optional) -- data split to evaluate on. Defaults to 'test'.

  • score_name (str) -- name of the score object

  • reduction (str) -- Specifies the reduction to apply to the output: 'micro' | 'macro'. 'micro': as if mini-batches are concatenated. 'macro': mean of the cross entropy of each mini-batch (update). Default: 'micro'
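For reference, the per-sample quantity being accumulated can be sketched in plain Python. This assumes the standard definition of cross entropy over log-softmax outputs, with label smoothing as defined for torch.nn.CrossEntropyLoss; it is a sketch, not the fedsim code:

```python
import math


def cross_entropy(logits, target, label_smoothing=0.0):
    """Cross entropy of one sample from unnormalized scores (logits).

    The target distribution places (1 - eps) on the true class and
    spreads eps / C uniformly over all C classes.
    """
    # numerically stable log-softmax
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    log_probs = [x - log_z for x in logits]

    eps, n = label_smoothing, len(logits)
    loss = 0.0
    for c, lp in enumerate(log_probs):
        weight = (1.0 - eps) * (1.0 if c == target else 0.0) + eps / n
        loss -= weight * lp
    return loss


print(round(cross_entropy([2.0, 0.0], 0), 4))  # 0.1269
print(round(cross_entropy([0.0, 0.0], 0), 4))  # log(2) = 0.6931, uniform logits
```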

get_score() float[source]#

returns the score

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

float -- the score

is_differentiable() bool[source]#

checks whether the score is differentiable (e.g. so that it can be used as a loss function).

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

bool -- True if the output of the call is differentiable.

reset() None[source]#

resets the internal buffers, making the score ready to start collecting again

Raises

NotImplementedError -- This abstract method should be implemented by child classes

class KLDivScore(log_freq: int = 1, split='test', score_name='kl_dic_score', reduction: str = 'micro', log_target=False)[source]#

updatable pointwise KL-divergence score

__call__(input, target) torch.Tensor[source]#

updates the KL-divergence score on a mini-batch, detached from the computational graph, and also returns the current batch score without detaching it from the graph.

Parameters
  • input (Tensor) -- Predicted unnormalized scores (often referred to as logits); see Shape section below for supported shapes.

  • target (Tensor) -- Ground truth class indices or class probabilities; see Shape section below for supported shapes.

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions.

  • Target: \((*)\), same shape as the input.

  • Output: scalar by default. If reduction is 'none', then \((*)\), same shape as the input.

Returns

Tensor -- KL-divergence score of current batch

Parameters
  • log_freq (int, optional) -- how many steps gap between two evaluations. Defaults to 1.

  • split (str, optional) -- data split to evaluate on. Defaults to 'test'.

  • score_name (str) -- name of the score object

  • reduction (str) -- Specifies the reduction to apply to the output: 'micro' | 'macro'. 'micro': as if mini-batches are concatenated. 'macro': mean of the KL-divergence of each mini-batch (update). Default: 'micro'
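The pointwise quantity can be sketched as follows, assuming the convention of torch.nn.functional.kl_div (input holds log-probabilities; target holds probabilities, or log-probabilities when log_target=True). That convention is an assumption about KLDivScore's internals, not something this page confirms:

```python
import math


def kl_div_pointwise(input, target, log_target=False):
    """Elementwise KL-divergence terms.

    `input` holds log-probabilities; `target` holds probabilities,
    or log-probabilities when log_target=True (assumed convention).
    """
    if log_target:
        # target given in log space: exp(t) * (t - i)
        return [math.exp(t) * (t - i) for i, t in zip(input, target)]
    return [t * (math.log(t) - i) for i, t in zip(input, target)]


# KL(p || q) with p = [0.5, 0.5] and q = [0.25, 0.75]
q_log = [math.log(0.25), math.log(0.75)]
terms = kl_div_pointwise(q_log, [0.5, 0.5])
print(round(sum(terms), 4))  # 0.1438
```

Passing the target in log space with log_target=True yields the same terms while avoiding an explicit log of small probabilities.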

get_score() float[source]#

returns the score

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

float -- the score

is_differentiable() bool[source]#

checks whether the score is differentiable (e.g. so that it can be used as a loss function).

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

bool -- True if the output of the call is differentiable.

reset() None[source]#

resets the internal buffers, making the score ready to start collecting again

Raises

NotImplementedError -- This abstract method should be implemented by child classes

class Score(log_freq: int = 1, split='test', score_name='', reduction='micro')[source]#

Score base class.

__call__(input, target)[source]#

updates the score based on a mini-batch of input and target

Parameters
  • input (Tensor) -- Predicted unnormalized scores (often referred to as logits); see Shape section below for supported shapes.

  • target (Tensor) -- Ground truth class indices or class probabilities; see Shape section below for supported shapes.

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Parameters
  • log_freq (int, optional) -- how many steps gap between two evaluations. Defaults to 1.

  • split (str, optional) -- data split to evaluate on. Defaults to 'test'.

  • score_name (str) -- name of the score object

  • reduction (str) -- Specifies the reduction to apply to the output: 'micro' | 'macro'. 'micro': as if mini-batches are concatenated. 'macro': mean of score of each mini-batch (update). Default: 'micro'

get_name() str[source]#

gives the name of the score

Returns

str -- score name

get_score() float[source]#

returns the score

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

float -- the score

is_differentiable() bool[source]#

checks whether the score is differentiable (e.g. so that it can be used as a loss function).

Raises

NotImplementedError -- This abstract method should be implemented by child classes

Returns

bool -- True if the output of the call is differentiable.

reset() None[source]#

resets the internal buffers, making the score ready to start collecting again

Raises

NotImplementedError -- This abstract method should be implemented by child classes
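A custom score is written by subclassing Score and overriding the abstract methods above. The following is a minimal plain-Python sketch of that contract; the MeanError child and its internals are hypothetical illustrations, not part of fedsim:

```python
class Score:
    """Minimal stand-in for the documented base-class contract."""

    def __init__(self, log_freq=1, split="test", score_name="", reduction="micro"):
        self.log_freq = log_freq
        self.split = split
        self.score_name = score_name
        self.reduction = reduction

    def get_name(self):
        return self.score_name

    # abstract methods: child classes must implement these
    def __call__(self, input, target):
        raise NotImplementedError

    def get_score(self):
        raise NotImplementedError

    def is_differentiable(self):
        raise NotImplementedError

    def reset(self):
        raise NotImplementedError


class MeanError(Score):
    """Hypothetical child: mean absolute error with 'micro' reduction."""

    def __init__(self, **kwargs):
        super().__init__(score_name="mean_error", **kwargs)
        self.reset()

    def __call__(self, input, target):
        errs = [abs(i - t) for i, t in zip(input, target)]
        self.total += sum(errs)   # update internal buffers ...
        self.count += len(errs)
        return sum(errs) / len(errs)  # ... and return the batch score

    def get_score(self):
        return self.total / self.count

    def is_differentiable(self):
        return True  # absolute error is differentiable almost everywhere

    def reset(self):
        self.total = 0.0
        self.count = 0


score = MeanError()
print(score([1, 2], [1, 4]))  # batch score: (0 + 2) / 2 = 1.0
print(score([3], [0]))        # batch score: 3.0
print(score.get_score())      # micro over all 3 samples: 5 / 3
```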