Java Inference SDK

After you deploy a Qwak-based model, your JVM-based client applications can use this module to get inferences from the model hosted as a real-time endpoint.

Inference Example

The following example invokes the model test_model. The model accepts a single feature vector containing three fields and produces one output field named "score".

RealtimeClient client = RealtimeClient.builder()
        .build();

PredictionResponse response = client.predict(PredictionRequest.builder()
        .feature("feature_a", "feature_value")
        .feature("feature_b", 1)
        .feature("feature_c", 0.5)
        .build());

Optional<PredictionResult> singlePrediction = response.getSinglePrediction();
double score = singlePrediction.get().getValueAsDouble("score");
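Because getSinglePrediction() returns an Optional, a client should handle the empty case rather than calling get() unconditionally. The unwrapping pattern can be sketched with a plain Optional<Double> standing in for the SDK's Optional<PredictionResult> (the value and error message here are illustrative, not part of the SDK):

```java
import java.util.Optional;

public class OptionalUnwrapExample {
    public static void main(String[] args) {
        // Stand-in for response.getSinglePrediction(): may be empty
        // if the endpoint returned no prediction.
        Optional<Double> singlePrediction = Optional.of(0.87);

        // Fail fast with a descriptive error instead of a bare
        // NoSuchElementException from get().
        double score = singlePrediction.orElseThrow(
                () -> new IllegalStateException("model returned no prediction"));

        System.out.println(score);
    }
}
```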


The Java Inference SDK is hosted in Qwak's internal Maven repository.

Maven Configuration

To set up a Maven-based application that uses the Java Inference SDK, add the following sections to the project's pom.xml:



<repositories>
  <repository>
    <name>Qwak Maven Repository</name>
    <url><!-- Qwak Maven repository URL --></url>
  </repository>
</repositories>

<dependencies>
  <!-- Java Inference SDK dependency coordinates -->
</dependencies>

Gradle Configuration

To set up a Gradle-based application that uses the Java Inference SDK, add the following sections to the project's build.gradle:

repositories {
  maven {
    url ""  // Qwak Maven repository URL
  }
}

dependencies {
  implementation ''  // Java Inference SDK coordinates
}

Model Metadata

To retrieve the model metadata, use the ModelMetadataClient:


ModelMetadataClient client = ModelMetadataClient.builder()
  .apiKey("YOUR QWAK API KEY")
  .build();

ModelMetadata metadata = client.getModelMetadata("MODEL NAME");

The ModelMetadata class has the following methods:

public Map<String, Object> getModel() // returns information about the model
public List<Map<String, Object>> getDeploymentDetails() // if the model is deployed, returns data about the deployment configuration
public Map<String, Map<String, Object>> getAudienceRoutesByEnvironment() // returns the audience configuration per environment
public List<Map<String, Object>> getBuilds() // returns data about the deployed builds
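The metadata is returned as plain maps and lists rather than typed objects, so callers inspect it by key and cast values at the call site. A minimal sketch of that pattern, using hard-coded data in place of real getModel() and getBuilds() results (the keys shown are illustrative, not the SDK's actual schema):

```java
import java.util.List;
import java.util.Map;

public class MetadataInspectionExample {
    public static void main(String[] args) {
        // Hypothetical stand-in for metadata.getModel(); keys are illustrative.
        Map<String, Object> model = Map.of(
                "name", "test_model",
                "created_by", "jane@example.com");

        // Hypothetical stand-in for metadata.getBuilds():
        // one map per deployed build.
        List<Map<String, Object>> builds = List.of(
                Map.of("build_id", "b-1", "status", "SUCCESSFUL"));

        // Values are untyped Objects, so cast (or convert) when reading them.
        String name = (String) model.get("name");
        System.out.println("model: " + name + ", deployed builds: " + builds.size());
    }
}
```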