Uses of Package org.anchoranalysis.inference.concurrency
Packages that use org.anchoranalysis.inference.concurrency

org.anchoranalysis.experiment.bean.task
    Defines a task, which is a set of operations run on an input, that may or may not generate an output.
org.anchoranalysis.image.inference.bean.segment.instance
    Instance-segmentation of a Stack.
org.anchoranalysis.inference.concurrency
    Specifying how many CPUs and GPUs can be allocated for some purpose.
org.anchoranalysis.plugin.annotation.bean.aggregate
    Beans to combine two or more annotations.
org.anchoranalysis.plugin.annotation.bean.comparison
    Task to compare a set of annotations to a segmentation or another set of annotations.
org.anchoranalysis.plugin.image.task.bean.combine
    Combining multiple images together into a single image.
org.anchoranalysis.plugin.image.task.bean.feature
    Tasks pertaining to Features.
org.anchoranalysis.plugin.image.task.bean.format
    Tasks for converting image formats.
org.anchoranalysis.plugin.image.task.bean.grouped
    Tasks involving stacks (usually each channel from an image) that are somehow grouped together.
org.anchoranalysis.plugin.image.task.bean.labeller
    Associating labels with images.
org.anchoranalysis.plugin.image.task.bean.scale
    Tasks to scale an image.
org.anchoranalysis.plugin.image.task.bean.segment
    Tasks to segment an image.
org.anchoranalysis.plugin.image.task.bean.slice
    Tasks that process one or more slices from a z-stack.
org.anchoranalysis.plugin.image.task.segment
    Non-bean classes for image segmentation.
org.anchoranalysis.plugin.io.bean.task
    Implementations of Task related to file-system I/O.
org.anchoranalysis.plugin.mpp.bean.convert
    Converting from NamedChannelsInput to the input type expected by a Task.
org.anchoranalysis.plugin.onnx.bean.object.segment.stack
    Segmenting a Stack using the ONNX Runtime to produce an ObjectCollection.
org.anchoranalysis.plugin.opencv.bean.object.segment.stack
    Segmentation of a Stack using OpenCV.
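The org.anchoranalysis.inference.concurrency package above is described as "specifying how many CPUs and GPUs can be allocated for some purpose". As a rough illustration of that idea only, a minimal allocation plan might look like the sketch below; the class and method names (SketchConcurrencyPlan, singleCpu, withGpus, totalWorkers) are hypothetical and are not the library's actual API:

```java
// Hypothetical sketch: an immutable plan recording how many CPU and GPU
// workers may run inference concurrently. Names are illustrative only.
public final class SketchConcurrencyPlan {
    private final int numberCpus;
    private final int numberGpus;

    private SketchConcurrencyPlan(int numberCpus, int numberGpus) {
        this.numberCpus = numberCpus;
        this.numberGpus = numberGpus;
    }

    // A plan that uses only CPU workers.
    public static SketchConcurrencyPlan singleCpu(int numberCpus) {
        return new SketchConcurrencyPlan(numberCpus, 0);
    }

    // A plan that additionally permits GPU workers.
    public static SketchConcurrencyPlan withGpus(int numberCpus, int numberGpus) {
        return new SketchConcurrencyPlan(numberCpus, numberGpus);
    }

    // Total number of workers that may run concurrently.
    public int totalWorkers() {
        return numberCpus + numberGpus;
    }

    public static void main(String[] args) {
        SketchConcurrencyPlan plan = SketchConcurrencyPlan.withGpus(4, 1);
        System.out.println(plan.totalWorkers()); // prints 5
    }
}
```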
Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.experiment.bean.task:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.image.inference.bean.segment.instance:
    - How many allocated CPUs and GPUs can be used concurrently for inference.
    - Keeps concurrent copies of a model to be used by different threads.
    - When creating a model to be used for inference fails.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.inference.concurrency:
    - How many allocated CPUs and GPUs can be used concurrently for inference.
    - An instance of a model that can be used concurrently for inference.
    - This exception indicates that an error occurred when performing inference from a model concurrently.
    - When creating a model to be used for inference fails.
    - Creates a model to use in the pool.
    - Wraps an element of type T to ensure priority is given when the flag gpu == true.
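The descriptions above outline a pool-and-priority pattern: concurrent copies of a model are shared between threads, and copies flagged gpu == true are handed out first. As a rough sketch of that pattern in plain Java (the names SketchModelPool, borrow, and giveBack are illustrative assumptions, not the library's actual API):

```java
import java.util.concurrent.PriorityBlockingQueue;

// Hypothetical sketch: a pool of interchangeable model copies, where
// copies flagged gpu == true are dequeued before CPU-only copies.
public class SketchModelPool<T> {

    // Wraps an element so GPU-backed entries sort first in the queue.
    static final class WithPriority<T> implements Comparable<WithPriority<T>> {
        final T element;
        final boolean gpu;

        WithPriority(T element, boolean gpu) {
            this.element = element;
            this.gpu = gpu;
        }

        @Override
        public int compareTo(WithPriority<T> other) {
            // gpu == true compares as "smaller", so it is taken first.
            return Boolean.compare(other.gpu, this.gpu);
        }
    }

    private final PriorityBlockingQueue<WithPriority<T>> queue =
            new PriorityBlockingQueue<>();

    public void add(T model, boolean gpu) {
        queue.put(new WithPriority<>(model, gpu));
    }

    // Blocks until a model copy is free; GPU copies are preferred.
    public WithPriority<T> borrow() throws InterruptedException {
        return queue.take();
    }

    // Returns the copy so another thread can use it.
    public void giveBack(WithPriority<T> model) {
        queue.put(model);
    }

    public static void main(String[] args) throws InterruptedException {
        SketchModelPool<String> pool = new SketchModelPool<>();
        pool.add("cpu-model", false);
        pool.add("gpu-model", true);

        WithPriority<String> first = pool.borrow();
        System.out.println(first.element); // prints gpu-model
        pool.giveBack(first);
    }
}
```

A PriorityBlockingQueue gives the blocking take() semantics for free; the only design decision is the comparator, which inverts the boolean so GPU copies sit at the head of the queue.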
Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.annotation.bean.aggregate:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.annotation.bean.comparison:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.combine:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.feature:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.format:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.grouped:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.labeller:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.scale:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.segment:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.bean.slice:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.image.task.segment:
    - Keeps concurrent copies of a model to be used by different threads.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.io.bean.task:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.mpp.bean.convert:
    - How many allocated CPUs and GPUs can be used concurrently for inference.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.onnx.bean.object.segment.stack:
    - How many allocated CPUs and GPUs can be used concurrently for inference.
    - Keeps concurrent copies of a model to be used by different threads.
    - When creating a model to be used for inference fails.

Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.plugin.opencv.bean.object.segment.stack:
    - How many allocated CPUs and GPUs can be used concurrently for inference.
    - Keeps concurrent copies of a model to be used by different threads.
    - When creating a model to be used for inference fails.