Interpreter to run custom models with TensorFlow Lite (requires API level 16+).

A model interpreter is created via getInstance(FirebaseModelOptions). Follow the steps below to specify the FirebaseModelOptions, create a FirebaseModelInterpreter, and run an inference with the model.
First, create at least one of FirebaseCloudModelSource and FirebaseLocalModelSource, and register them with FirebaseModelManager. Note that FirebaseCloudModelSource has higher precedence: the FirebaseLocalModelSource is only used if there is no FirebaseCloudModelSource, or if the download of the FirebaseCloudModelSource fails. It is recommended to specify both a local and a cloud model source in case the cloud model download fails or the network is slow.

The following code creates and registers the model sources:
FirebaseModelManager.getInstance()
    .registerCloudModelSource(
        new FirebaseCloudModelSource.Builder(cloudModelName)
            .setInitialDownloadConditions(initialConditions)
            .setUpdatesDownloadConditions(updateConditions)
            .build());
FirebaseModelManager.getInstance()
    .registerLocalModelSource(
        new FirebaseLocalModelSource.Builder(localModelName)
            .setFilePath(localModelFilePath)
            .build());
FirebaseModelOptions modelOptions =
    new FirebaseModelOptions.Builder()
        .setCloudModelName(cloudModelName)
        .setLocalModelName(localModelName)
        .build();
Second, create an instance of FirebaseModelInterpreter with the FirebaseModelOptions:

FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(modelOptions);
Third, create a FirebaseModelInputs and a FirebaseModelInputOutputOptions. You need to know the input and output data specifications of the model, including the data types and input/output dimensions. The following code is an example:

FirebaseModelInputs modelInputs = new FirebaseModelInputs.Builder().add(...).build();
FirebaseModelInputOutputOptions inputOutputOptions =
    new FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, FirebaseModelDataType.FLOAT32, inputDims)
        .setOutputFormat(0, FirebaseModelDataType.FLOAT32, outputDims)
        .build();
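The add(...) argument is elided above. As a sketch for a hypothetical image-classification model taking one 224x224 RGB image and producing 1000 class scores (the shapes are illustrative, not part of the API):

```java
int[] inputDims = {1, 224, 224, 3};  // batch, height, width, channels
int[] outputDims = {1, 1000};        // batch, number of classes

// The input array must match the declared FLOAT32 format and dimensions.
float[][][][] input = new float[1][224][224][3];
// ... fill `input` with the preprocessed image pixels ...

FirebaseModelInputs modelInputs =
    new FirebaseModelInputs.Builder()
        .add(input)  // add one entry per model input, in index order
        .build();
```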
Lastly, feed the inputs to run(FirebaseModelInputs, FirebaseModelInputOutputOptions). The following code is an example:

Task<FirebaseModelOutputs> task = interpreter.run(modelInputs, inputOutputOptions);
task.addOnSuccessListener(...).addOnFailureListener(...);
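The listener arguments are elided above. A sketch of consuming the result (the output type must match the declared output format; the shape here is illustrative):

```java
interpreter.run(modelInputs, inputOutputOptions)
    .addOnSuccessListener(result -> {
        // Output index 0 was declared FLOAT32 with dimensions outputDims.
        float[][] scores = result.getOutput(0);
        // ... use scores[0] ...
    })
    .addOnFailureListener(e -> {
        // Inference failed; inspect the exception (typically a FirebaseMLException).
    });
```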
Public Method Summary

void | close()
---|---
Task<Integer> | getInputIndex(String name): Gets the index of an input by its name.
synchronized static FirebaseModelInterpreter | getInstance(FirebaseModelOptions options): Gets an instance of the FirebaseModelInterpreter to support the custom model specified in FirebaseModelOptions.
Task<Integer> | getOutputIndex(String name): Gets the index of an output by its name.
boolean | isStatsCollectionEnabled(): Determines whether stats collection in the model interpreter is enabled.
Task<FirebaseModelOutputs> | run(FirebaseModelInputs inputs, FirebaseModelInputOutputOptions options): Runs inference with input and data configurations.
void | setStatsCollectionEnabled(boolean enable): Enables stats collection in the ML Kit model interpreter.
Public Methods
public void close ()

Throws
IOException |
---|---
public Task<Integer> getInputIndex (String name)

Gets the index of an input by its name.

Parameters
name | the name of the input
---|---

Returns
- a Task for the index of the input. The Task will fail with FirebaseMLException if the input name does not exist in the model.
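As a sketch of resolving an index asynchronously (the input name "image_tensor" is illustrative):

```java
interpreter.getInputIndex("image_tensor")
    .addOnSuccessListener(index -> {
        // `index` can be used when declaring formats in FirebaseModelInputOutputOptions.
    })
    .addOnFailureListener(e -> {
        // Fails with FirebaseMLException if the model has no input named "image_tensor".
    });
```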
public static synchronized FirebaseModelInterpreter getInstance (FirebaseModelOptions options)

Gets an instance of the FirebaseModelInterpreter to support the custom model specified in FirebaseModelOptions. This only works for API level 16 (JELLY_BEAN) and above.

Parameters
options | the options for the model to use
---|---

Returns
- an instance of FirebaseModelInterpreter. Note that the same interpreter instance is returned if the supplied options are the same.

Throws
IllegalArgumentException | if FirebaseModelOptions contains no model source
---|---
FirebaseMLException |
public Task<Integer> getOutputIndex (String name)

Gets the index of an output by its name.

Parameters
name | the name of the output
---|---

Returns
- a Task for the index of the output. The Task will fail with FirebaseMLException if the output name does not exist in the model.
public boolean isStatsCollectionEnabled ()
Determines whether stats collection in Model Interpreter is enabled.
Returns
- true if stats collection is enabled.
public Task<FirebaseModelOutputs> run (FirebaseModelInputs inputs, FirebaseModelInputOutputOptions options)

Runs inference with input and data configurations.

Parameters
inputs | an instance of FirebaseModelInputs containing the input data
---|---
options | an instance of FirebaseModelInputOutputOptions containing the types and dimensions of the input and output data

Returns
- a Task of FirebaseModelOutputs.

Throws
FirebaseMLException |
---|---
public void setStatsCollectionEnabled (boolean enable)

Enables stats collection in the ML Kit model interpreter. The stats include API call counts, errors, API call durations, options, etc. No personally identifiable information is logged.

The setting is per FirebaseApp, and it persists together with the app's private data; if the user uninstalls the app or clears all app data, the setting is erased. The best practice is to set the flag on each initialization.

By default, logging is enabled. You have to explicitly set it to false here to disable logging for FirebaseModelInterpreter.
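Because the persisted setting is reset when app data is cleared, a minimal sketch of applying the opt-out on every startup (per the best-practice note above):

```java
FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(modelOptions);
// Disable stats collection for this interpreter; re-apply on every app start,
// since clearing app data restores the default (enabled).
interpreter.setStatsCollectionEnabled(false);
```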