This class is deprecated. For more information, refer to the custom model implementation instructions.
Interpreter to run custom models with TensorFlow Lite (requires API level 16+).
A model interpreter is created via getInstance(FirebaseModelInterpreterOptions). Follow the steps below to specify a FirebaseCustomRemoteModel or FirebaseCustomLocalModel, create a FirebaseModelInterpreterOptions, create a FirebaseModelInterpreter, and finally run inference with the model.
Firstly, create a FirebaseCustomRemoteModel or a FirebaseCustomLocalModel. If you want to use a FirebaseCustomRemoteModel, you need to download it through FirebaseModelManager first (a download sketch follows the code below). The following code creates the model and the interpreterOptions:
 FirebaseCustomRemoteModel remoteModel = new FirebaseCustomRemoteModel.Builder(REMOTE_MODEL_NAME)
     .build();
 FirebaseModelInterpreterOptions interpreterOptions =
      new FirebaseModelInterpreterOptions.Builder(remoteModel).build();
 Or
 FirebaseCustomLocalModel localModel = new FirebaseCustomLocalModel.Builder()
     .setFilePath(filePath).build();
 FirebaseModelInterpreterOptions interpreterOptions =
     new FirebaseModelInterpreterOptions.Builder(localModel).build();
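As noted above, a remote model must be downloaded through FirebaseModelManager before it can be used. A minimal sketch, assuming the FirebaseModelManager and FirebaseModelDownloadConditions classes from the same ML Kit library; the wifi-only condition is just an example choice:
 FirebaseModelDownloadConditions conditions = new FirebaseModelDownloadConditions.Builder()
     .requireWifi() // example condition: only download over wifi
     .build();
 FirebaseModelManager.getInstance().download(remoteModel, conditions)
     .addOnSuccessListener(v -> {
       // The model is available locally; it is now safe to run the interpreter.
     })
     .addOnFailureListener(e -> {
       // Handle download failures, e.g. no network or an unknown model name.
     });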
Secondly, create an instance of FirebaseModelInterpreter with the FirebaseModelInterpreterOptions:
 FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(interpreterOptions);
Thirdly, create a FirebaseModelInputs and a FirebaseModelInputOutputOptions. You need to know the input and output data specifications of the model, including the data types and input/output dimensions. The following code is an example:
 FirebaseModelInputs modelInputs = new FirebaseModelInputs.Builder().add(...).build();
 FirebaseModelInputOutputOptions inputOutputOptions =
     new FirebaseModelInputOutputOptions.Builder()
         .setInputFormat(0, FirebaseModelDataType.FLOAT32, inputDims)
         .setOutputFormat(0, FirebaseModelDataType.FLOAT32, outputDims)
         .build();
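As a concrete illustration, suppose the model takes a single 1x224x224x3 FLOAT32 image tensor and produces a 1x1000 FLOAT32 score vector; these dimensions are hypothetical placeholders for your model's actual specification:
 int[] inputDims = new int[] {1, 224, 224, 3}; // batch, height, width, channels (hypothetical)
 int[] outputDims = new int[] {1, 1000};       // batch, number of classes (hypothetical)

 float[][][][] input = new float[1][224][224][3]; // fill with preprocessed image data
 FirebaseModelInputs modelInputs = new FirebaseModelInputs.Builder()
     .add(input) // add() also accepts a ByteBuffer
     .build();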
Lastly, feed the inputs to run(FirebaseModelInputs, FirebaseModelInputOutputOptions). The following code is an example:
 Task<FirebaseModelOutputs> task = interpreter.run(modelInputs, inputOutputOptions);
 task.addOnSuccessListener(...).addOnFailureListener(...);
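On success, each output can be read by index from the resulting FirebaseModelOutputs via getOutput(int); the array shape below matches the hypothetical 1x1000 output above:
 task.addOnSuccessListener(result -> {
       float[][] scores = result.getOutput(0); // shape matches outputDims
       // Use scores[0] here, e.g. find the highest-scoring class.
     })
     .addOnFailureListener(e -> {
       // Inference failed; e is typically a FirebaseMLException.
     });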
 Public Method Summary
| Return type | Method and description |
|---|---|
| void | close() |
| Task<Integer> | getInputIndex(String name) Gets the index of an input by its name. |
| static FirebaseModelInterpreter | getInstance(FirebaseModelInterpreterOptions options) Gets an instance of the FirebaseModelInterpreter to support the custom model specified in FirebaseModelOptions. |
| Task<Integer> | getOutputIndex(String name) Gets the index of an output by its name. |
| boolean | isStatsCollectionEnabled() Determines whether stats collection in Model Interpreter is enabled. |
| Task<FirebaseModelOutputs> | run(FirebaseModelInputs inputs, FirebaseModelInputOutputOptions options) Runs inference with input and data configurations. |
| void | setStatsCollectionEnabled(boolean enable) Enables stats collection in ML Kit model interpreter. |
Public Methods
public void close ()
public Task<Integer> getInputIndex (String name)
Gets the index of an input by its name.
Parameters
| name | the name of the input |
Returns
- a Task for the index of the input. The Task will fail with FirebaseMLException if the input name does not exist in the model.
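For example (the input name "image" is hypothetical):
 interpreter.getInputIndex("image")
     .addOnSuccessListener(index -> {
       // index is the position of the "image" input in the model.
     })
     .addOnFailureListener(e -> {
       // The name does not exist in the model; e is a FirebaseMLException.
     });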
public static FirebaseModelInterpreter getInstance (FirebaseModelInterpreterOptions options)
Gets an instance of the FirebaseModelInterpreter to support the custom model specified in FirebaseModelOptions.
Parameters
| options | the options for the model to use. |
Returns
- an instance of FirebaseModelInterpreter. Note that the interpreter instance will be the same instance if the supplied options are the same.
Throws
| IllegalArgumentException | if FirebaseModelInterpreterOptions contains no model |
| FirebaseMLException | |
public Task<Integer> getOutputIndex (String name)
Gets the index of an output by its name.
Parameters
| name | the name of the output |
Returns
- a Task for the index of the output. The Task will fail with FirebaseMLException if the output name does not exist in the model.
public boolean isStatsCollectionEnabled ()
Determines whether stats collection in Model Interpreter is enabled.
Returns
- true if stats collection is enabled.
public Task<FirebaseModelOutputs> run (FirebaseModelInputs inputs, FirebaseModelInputOutputOptions options)
Runs inference with input and data configurations.
Parameters
| inputs | an instance of FirebaseModelInputs containing input data |
| options | an instance of FirebaseModelInputOutputOptions containing types and dimensions of input and output data |
public void setStatsCollectionEnabled (boolean enable)
Enables stats collection in ML Kit model interpreter. The stats include API calls counts, errors, API call durations, options, etc. No personally identifiable information is logged.
The setting is per MlKitContext and persists with the app's private data, which means that if the user uninstalls the app or clears all app data, the setting is erased. The best practice is to set the flag on each initialization.
By default, logging is enabled. You have to explicitly set it to false here to disable logging for FirebaseModelInterpreter.
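For example, to opt out at startup (a minimal sketch; interpreterOptions is the options object created earlier):
 FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(interpreterOptions);
 interpreter.setStatsCollectionEnabled(false); // disable stats logging for this interpreter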