FirebaseModelInterpreter

public final class FirebaseModelInterpreter extends Object
implements Closeable

Interpreter to run custom models with TensorFlow Lite (requires API level 16+)

A model interpreter is created via getInstance(FirebaseModelOptions). Follow the steps below to specify the FirebaseModelOptions, create a FirebaseModelInterpreter, and run an inference with the model.

First, create at least one of FirebaseRemoteModel and FirebaseLocalModel, and register it with FirebaseModelManager. Note that the FirebaseRemoteModel takes precedence: the FirebaseLocalModel is used only if there is no FirebaseRemoteModel or the download of the FirebaseRemoteModel fails. Specifying both a local and a remote model is recommended, in case the download of the FirebaseRemoteModel fails or the network is slow.

The following code creates and registers the model:


 FirebaseModelManager.getInstance()
  .registerRemoteModel(
      new FirebaseRemoteModel.Builder(remoteModelName)
          .setInitialDownloadConditions(initialConditions)
          .setUpdatesDownloadConditions(updateConditions)
          .build());

 FirebaseModelManager.getInstance()
  .registerLocalModel(
      new FirebaseLocalModel.Builder(localModelName)
          .setFilePath(localModelFilePath)
          .build());

 FirebaseModelOptions modelOptions =
    new FirebaseModelOptions.Builder()
        .setRemoteModelName(remoteModelName)
        .setLocalModelName(localModelName)
        .build();
 

Second, create an instance of FirebaseModelInterpreter with the FirebaseModelOptions.


 FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(modelOptions);
 

Third, create a FirebaseModelInputs and a FirebaseModelInputOutputOptions. You need to know the model's input and output data specifications, including the data types and the input/output dimensions. The following code is an example:


 FirebaseModelInputs modelInputs = new FirebaseModelInputs.Builder().add(...).build();

 FirebaseModelInputOutputOptions inputOutputOptions =
     new FirebaseModelInputOutputOptions.Builder()
         .setInputFormat(0, FirebaseModelDataType.FLOAT32, inputDims)
         .setOutputFormat(0, FirebaseModelDataType.FLOAT32, outputDims)
         .build();
 
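The dimension arrays passed to setInputFormat and setOutputFormat describe the tensor shapes, and the object passed to add(...) must match the declared input shape. As a sketch, assuming a hypothetical image classification model that takes one 224x224 RGB image as FLOAT32 input and produces 1000 class scores (these shapes are illustrative, not from the original page):

```java
// Hypothetical shapes for an image classification model:
// one 224x224 image with 3 color channels in, 1000 class scores out.
int[] inputDims = {1, 224, 224, 3};
int[] outputDims = {1, 1000};

// The object passed to FirebaseModelInputs.Builder.add(...) must have a
// matching shape, e.g. a float[1][224][224][3] filled with pixel values.
float[][][][] input = new float[1][224][224][3];
```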

Finally, feed the inputs to run(FirebaseModelInputs, FirebaseModelInputOutputOptions). The following code is an example:


 Task<FirebaseModelOutputs> task = interpreter.run(modelInputs, inputOutputOptions);
 task.addOnSuccessListener(...).addOnFailureListener(...);
 
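Inside the success listener, FirebaseModelOutputs.getOutput(int) returns the raw output array (a float[][] for a FLOAT32 output). A common post-processing step for a classifier is picking the index of the highest score. The listener wiring requires the Firebase runtime, so it is shown as a comment; argMax is an illustrative helper, not part of the API:

```java
// Inside the success listener, the raw output can be read back, e.g.:
//   task.addOnSuccessListener(result -> {
//       float[][] output = result.getOutput(0);
//       int best = argMax(output[0]);
//   });

// Returns the index of the largest score, or -1 for an empty array.
static int argMax(float[] scores) {
    int best = -1;
    float bestScore = Float.NEGATIVE_INFINITY;
    for (int i = 0; i < scores.length; i++) {
        if (scores[i] > bestScore) {
            bestScore = scores[i];
            best = i;
        }
    }
    return best;
}
```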

Public Method Summary

void
close()

Task<Integer>
getInputIndex(String name)
Gets the index of an input by its name.

synchronized static FirebaseModelInterpreter
getInstance(FirebaseModelOptions options)
Gets an instance of FirebaseModelInterpreter to support the custom model specified in FirebaseModelOptions.

Task<Integer>
getOutputIndex(String name)
Gets the index of an output by its name.

boolean
isStatsCollectionEnabled()
Determines whether stats collection in the model interpreter is enabled.
Task<