| Interface | Description |
|---|---|
| CreateModelForPool&lt;T extends InferenceModel&gt; | Creates a model to use in the pool. |
| Class | Description |
|---|---|
| ConcurrencyPlan | How many allocated CPUs and GPUs can be used concurrently for inference. |
| ConcurrentModel&lt;T extends InferenceModel&gt; | An instance of a model that can be used concurrently for inference. |
| ConcurrentModelPool&lt;T extends InferenceModel&gt; | Keeps concurrent copies of a model to be used by different threads. |
| WithPriority&lt;T&gt; | Wraps an element of type T to ensure priority is given when the flag gpu==true. |
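To illustrate the idea behind a pool that keeps concurrent copies of a model for different threads, here is a minimal, self-contained sketch. It is not the library's actual API: the class SimpleModelPool and its execute method are hypothetical names, and a real ConcurrentModelPool may manage CPU/GPU priority and model creation quite differently.

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Function;

// Hypothetical sketch: a pool holding several copies of a model, so that
// each thread borrows its own copy and no copy is used concurrently.
class SimpleModelPool<T> {

    // Copies not currently borrowed by any thread.
    private final BlockingQueue<T> available;

    SimpleModelPool(List<T> copies) {
        this.available = new ArrayBlockingQueue<>(copies.size(), false, copies);
    }

    // Borrows a copy, applies the task, and always returns the copy to the pool.
    <R> R execute(Function<T, R> task) throws InterruptedException {
        T model = available.take(); // blocks until a copy is free
        try {
            return task.apply(model);
        } finally {
            available.put(model); // make the copy available again
        }
    }
}

public class PoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // Two model copies means at most two threads infer at the same time.
        SimpleModelPool<String> pool =
                new SimpleModelPool<>(List.of("model-A", "model-B"));
        String result = pool.execute(m -> m + ": inferred");
        System.out.println(result);
    }
}
```

A blocking queue gives the essential behavior for free: a thread asking for a model waits until one of the fixed set of copies is returned, bounding concurrency to the number of copies.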
| Exception | Description |
|---|---|
| ConcurrentModelException | This exception indicates that an error occurred when performing inference from a model concurrently. |
| CreateModelFailedException | When creating a model to be used for inference fails. |
Copyright © 2010–2023 Owen Feehan, ETH Zurich, University of Zurich, Hoffmann-La Roche. All rights reserved.