
FirebaseModelInterpreter

public final class FirebaseModelInterpreter extends Object
implements Closeable

Interpreter to run custom models with TensorFlow Lite (requires API level 16+)

A model interpreter is created via getInstance(FirebaseModelInterpreterOptions). Follow the steps below to specify a FirebaseCustomRemoteModel or FirebaseCustomLocalModel, create a FirebaseModelInterpreterOptions, create a FirebaseModelInterpreter, and finally run an inference with the model.

Firstly, create a FirebaseCustomRemoteModel, a FirebaseCustomLocalModel, or both. If you want to use a FirebaseCustomRemoteModel, you need to download it through FirebaseModelManager first, as sketched below.
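
For example, a remote-model download could be triggered as follows. This is a minimal sketch: REMOTE_MODEL_NAME stands in for your hosted model's name, and the download conditions shown are illustrative.

 FirebaseCustomRemoteModel remoteModel =
     new FirebaseCustomRemoteModel.Builder(REMOTE_MODEL_NAME).build();
 // Illustrative conditions; adjust to your app's needs.
 FirebaseModelDownloadConditions conditions =
     new FirebaseModelDownloadConditions.Builder().requireWifi().build();
 FirebaseModelManager.getInstance().download(remoteModel, conditions)
     .addOnSuccessListener(unused -> {
       // The remote model is now available to the interpreter.
     });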

The following code creates the model and the interpreterOptions:



 FirebaseCustomRemoteModel remoteModel = new FirebaseCustomRemoteModel.Builder(REMOTE_MODEL_NAME)
     .build();

 FirebaseModelInterpreterOptions interpreterOptions =
      new FirebaseModelInterpreterOptions.Builder(remoteModel).build();

Or, for a local model:

 FirebaseCustomLocalModel localModel = new FirebaseCustomLocalModel.Builder()
     .setFilePath(filePath).build();

 FirebaseModelInterpreterOptions interpreterOptions =
     new FirebaseModelInterpreterOptions.Builder(localModel).build();

 

Secondly, create an instance of FirebaseModelInterpreter with the FirebaseModelInterpreterOptions:


 FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(interpreterOptions);
 

Thirdly, create a FirebaseModelInputs and a FirebaseModelInputOutputOptions. You need to know the input and output data specifications of the model, including the data types and input/output dimensions. The following code is an example:


 FirebaseModelInputs modelInputs = new FirebaseModelInputs.Builder().add(...).build();

 FirebaseModelInputOutputOptions inputOutputOptions =
     new FirebaseModelInputOutputOptions.Builder()
         .setInputFormat(0, FirebaseModelDataType.FLOAT32, inputDims)
         .setOutputFormat(0, FirebaseModelDataType.FLOAT32, outputDims)
         .build();
 
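
For instance, inputDims and outputDims above must match the model's tensor shapes. The values below are assumptions for a hypothetical 224x224 RGB image classifier with 1000 classes, not shapes prescribed by the API:

 int[] inputDims = new int[]{1, 224, 224, 3};  // batch, height, width, channels (assumed)
 int[] outputDims = new int[]{1, 1000};        // batch, class count (assumed)

 // A FLOAT32 input matching inputDims; fill it with normalized pixel values.
 float[][][][] input = new float[1][224][224][3];
 FirebaseModelInputs modelInputs = new FirebaseModelInputs.Builder()
     .add(input)
     .build();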

Lastly, feed the inputs to run(FirebaseModelInputs, FirebaseModelInputOutputOptions). The following code is an example:


 Task<FirebaseModelOutputs> task = interpreter.run(modelInputs, inputOutputOptions);
 task.addOnSuccessListener(...).addOnFailureListener(...);
 
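
The listeners might then consume the result, for example like this (a sketch assuming the FLOAT32 output format declared above):

 task.addOnSuccessListener(result -> {
       // getOutput(0) returns the tensor declared with setOutputFormat(0, ...).
       float[][] output = result.getOutput(0);
       float[] probabilities = output[0];
       // Use probabilities here.
     }).addOnFailureListener(e -> {
       // Handle the inference failure.
     });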

Public Method Summary

void
close()
Task<Integer>
getInputIndex(String name)
Gets the index of an input by its name.
static FirebaseModelInterpreter
getInstance(FirebaseModelInterpreterOptions options)
Gets an instance of the FirebaseModelInterpreter to support the custom model specified in FirebaseModelInterpreterOptions.
Task<Integer>
getOutputIndex(String name)
Gets the index of an output by its name.
boolean
isStatsCollectionEnabled()
Determines whether stats collection in Model Interpreter is enabled.
Task<FirebaseModelOutputs>
run(FirebaseModelInputs inputs, FirebaseModelInputOutputOptions options)
Runs inference with input and data configurations.
void
setStatsCollectionEnabled(boolean enable)
Enables or disables stats collection in the ML Kit model interpreter.

Inherited Method Summary

Public Methods

public void close ()

Closes the interpreter and releases the resources associated with it.

public Task<Integer> getInputIndex (String name)

Gets the index of an input by its name.

Parameters
name the name of the input
Returns
  • a Task whose result is the index of the input with the given name.

public static FirebaseModelInterpreter getInstance (FirebaseModelInterpreterOptions options)

Gets an instance of the FirebaseModelInterpreter to support the custom model specified in FirebaseModelInterpreterOptions.

Parameters
options the options for the model to use.
Returns
  • an instance of FirebaseModelInterpreter. Note that the interpreter instance will be the same instance if the supplied options are the same.
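
For example, requesting an interpreter twice with the same options yields the same object (an illustrative sketch):

 FirebaseModelInterpreter first = FirebaseModelInterpreter.getInstance(interpreterOptions);
 FirebaseModelInterpreter second = FirebaseModelInterpreter.getInstance(interpreterOptions);
 // first == second: the same instance is returned for the same options.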

public Task<Integer> getOutputIndex (String name)

Gets the index of an output by its name.

Parameters
name the name of the output
Returns
  • a Task whose result is the index of the output with the given name.
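
A brief usage sketch; the tensor name "prob" is hypothetical:

 interpreter.getOutputIndex("prob")  // "prob" is a hypothetical output name
     .addOnSuccessListener(index -> {
       // Use the index, e.g. with FirebaseModelInputOutputOptions.Builder.setOutputFormat(index, ...).
     });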

public boolean isStatsCollectionEnabled ()

Determines whether stats collection in Model Interpreter is enabled.

Returns
  • true if stats collection is enabled.
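
A short usage sketch combining the two stats-collection methods (illustrative):

 FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(interpreterOptions);
 interpreter.setStatsCollectionEnabled(false);                   // opt out of stats collection
 boolean statsEnabled = interpreter.isStatsCollectionEnabled();  // now false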