public final class ImageInferenceContext extends Object
Objects that may be helpful for decoding output from the inference.
| Constructor and Description |
|---|
| `ImageInferenceContext(DualScale<Dimensions> dimensions, ScaleFactor scaleFactor, Optional<List<String>> classLabels, VoxelsResizer resizer, ExecutionTimeRecorder executionTimeRecorder, Logger logger)` |
| Modifier and Type | Method and Description |
|---|---|
| `boolean` | `equals(Object o)` |
| `Optional<List<String>>` | `getClassLabels()` If available, labels of classes loaded from a text file at `classLabelsPath`. |
| `DualScale<Dimensions>` | `getDimensions()` The size of the image to be segmented, before and after any scaling for model inference. |
| `ExecutionTimeRecorder` | `getExecutionTimeRecorder()` Allows execution-time for particular operations to be recorded. |
| `Logger` | `getLogger()` Where to log informational messages during inference. |
| `VoxelsResizer` | `getResizer()` How to resize images or voxel-buffers. |
| `ScaleFactor` | `getScaleFactor()` The inverse of the scaling-factor applied to reduce `unscaledDimensions` to the input-matrix used for inference. |
| `int` | `hashCode()` |
| `DualScale<Optional<ScaleFactor>>` | `scaleFactorUpscale()` The scaling-factors needed to upscale the model output to match the desired scale. |
| `String` | `toString()` |
public ImageInferenceContext(DualScale<Dimensions> dimensions, ScaleFactor scaleFactor, Optional<List<String>> classLabels, VoxelsResizer resizer, ExecutionTimeRecorder executionTimeRecorder, Logger logger)
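For orientation, the sketch below shows one way to create the context. It is illustrative only: the `createContext` helper is hypothetical, and it simply forwards pre-existing collaborator objects (assumed to be supplied by the surrounding framework) to the constructor documented above.

```java
import java.util.Optional;

// A minimal sketch, assuming the collaborating objects already exist;
// only the constructor documented above is used. The createContext
// helper itself is hypothetical.
ImageInferenceContext createContext(
        DualScale<Dimensions> dimensions,
        ScaleFactor scaleFactor,
        VoxelsResizer resizer,
        ExecutionTimeRecorder recorder,
        Logger logger) {
    // Pass Optional.empty() when no class-labels file is available.
    return new ImageInferenceContext(
            dimensions, scaleFactor, Optional.empty(), resizer, recorder, logger);
}
```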
public final DualScale<Optional<ScaleFactor>> scaleFactorUpscale()
The scaling-factors needed to upscale the model output to match the desired scale. Each element, when present, is the factor to scale the respective element of a DualScale by, so they become identically sized to the DualScale.atInputScale().
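A hedged sketch of consuming this value follows; `upscaleIfNeeded` is a hypothetical helper, and only methods documented on this page are called.

```java
import java.util.Optional;

// Hypothetical helper, illustrative only.
void upscaleIfNeeded(ImageInferenceContext context) {
    // An empty Optional means the output is already at the desired size.
    Optional<ScaleFactor> upscale = context.scaleFactorUpscale().atInputScale();
    upscale.ifPresent(factor -> {
        // Multiply the model output by `factor` so it matches the dimensions
        // at the input scale; the actual resizing could use getResizer().
    });
}
```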
public DualScale<Dimensions> getDimensions()
The size of the image to be segmented, before and after any scaling for model inference.
public ScaleFactor getScaleFactor()
The inverse of the scaling-factor applied to reduce unscaledDimensions to the input-matrix used for inference.

public Optional<List<String>> getClassLabels()
If available, labels of classes loaded from a text file at classLabelsPath.

public VoxelsResizer getResizer()
How to resize images or voxel-buffers.
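As an illustration, a predicted class-index might be translated into a human-readable label as below; `labelFor` is a hypothetical helper, falling back to the numeric index when no labels file was loaded.

```java
// Hypothetical helper, illustrative only: map a class-index from the model
// output to its label, when a class-labels file was loaded.
String labelFor(ImageInferenceContext context, int classIndex) {
    return context.getClassLabels()
            .filter(labels -> classIndex >= 0 && classIndex < labels.size())
            .map(labels -> labels.get(classIndex))
            .orElse(String.valueOf(classIndex)); // fall back to the raw index
}
```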
public ExecutionTimeRecorder getExecutionTimeRecorder()
Allows execution-time for particular operations to be recorded.

public Logger getLogger()
Where to log informational messages during inference.
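A final hedged sketch combines both accessors. It assumes ExecutionTimeRecorder offers an overload taking an operation identifier and a `Runnable`, and that Logger exposes a message-logger with a `log(String)` method; both assumptions should be verified against those classes' own documentation.

```java
// Hypothetical helper, illustrative only.
void decodeWithTiming(ImageInferenceContext context, Runnable decodeOperation) {
    // Assumption: a recordExecutionTime(String, Runnable) overload exists.
    context.getExecutionTimeRecorder()
            .recordExecutionTime("Decode inference output", decodeOperation);
    // Assumption: Logger exposes messageLogger() with a log(String) method.
    context.getLogger().messageLogger().log("Decoding complete.");
}
```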