Inference Scheduling

- Inference on demand; continuous inference
- Run multiple AI models in parallel:
  - using the same imager/sensor inputs
  - using logical operators to infer decisions
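A minimal sketch of the parallel-model idea: two models consume the same sensor frame concurrently, and their boolean outputs are combined with a logical operator to reach a decision. The detector functions and frame fields here are hypothetical placeholders, not a real model API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for two AI models; a real system would run
# trained networks over the raw imager/sensor frame instead.
def person_detector(frame):
    return frame["brightness"] > 0.2

def motion_detector(frame):
    return frame["delta"] > 0.5

def infer_decision(frame):
    """Run both models in parallel on the same input frame, then
    combine their outputs with a logical AND to infer a decision."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        person = pool.submit(person_detector, frame)
        motion = pool.submit(motion_detector, frame)
        # Logical operator: decide positive only if both models agree.
        return person.result() and motion.result()

frame = {"brightness": 0.8, "delta": 0.9}
print(infer_decision(frame))  # True: both detectors fired
```

Swapping `and` for `or` (or any boolean expression over the model outputs) changes the decision policy without touching the models themselves.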