Local Inference
Inference for the local client.
- local_inference(model, data_loader, scores, device='cpu', transform_y=None)
Evaluates the performance of a model on a test set.
- Parameters
model (Module) -- model to get the predictions from
data_loader (Iterable) -- inference data loader.
scores (Dict[str, Score]) -- scores to evaluate
device (str, optional) -- device to load the data onto ("cpu", "cuda", or a device ordinal). Must be the same device the model parameters reside on. Defaults to "cpu".
transform_y (Callable, optional) -- a function that takes raw labels and modifies them. Defaults to None.
- Returns
int -- the number of samples the evaluation was performed on.
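To illustrate the interface described above, here is a minimal sketch of how such an inference loop might look. It uses plain Python callables in place of a real torch Module, DataLoader, and Score objects; the loop body and the assumed `score(preds, y)` call convention are illustrative assumptions, not the library's actual implementation:

```python
def local_inference(model, data_loader, scores, device="cpu", transform_y=None):
    """Sketch: run the model over the loader, feed every score, count samples."""
    n_samples = 0
    for x, y in data_loader:
        # Optionally remap raw labels before scoring.
        if transform_y is not None:
            y = transform_y(y)
        # Stand-in for a forward pass on `device` (no gradients needed here).
        preds = model(x)
        # Assumed Score interface: each score object is updated per batch.
        for score in scores.values():
            score(preds, y)
        n_samples += len(y)
    return n_samples
```

A typical call would pass a dict such as `{"accuracy": acc_score}` and read the accumulated values from the score objects afterwards; the return value only reports how many samples were seen.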