
Deploy and manage custom models

You can deploy and manage custom models and AutoML-trained models using either the Firebase console or the Firebase Admin Python and Node.js SDKs. If you just want to deploy a model and occasionally update it, it's usually simplest to use the Firebase console. The Admin SDK can be helpful when integrating with build pipelines, working with Colab or Jupyter notebooks, and in other workflows.

Deploy and manage models in the Firebase console

TensorFlow Lite models

To deploy a TensorFlow Lite model using the Firebase console:

  1. Open the Firebase ML Custom model page in the Firebase console.
  2. Click Add custom model (or Add another model).
  3. Specify a name that will be used to identify your model in your Firebase project, then upload the TensorFlow Lite model file (usually ending in .tflite or .lite).

After you deploy your model, you can find it on the Custom page. From there, you can complete tasks such as updating the model with a new file, downloading the model, and deleting the model from your project.

Deploy and manage models with the Firebase Admin SDK

This section shows how to complete common model deployment and management tasks with the Admin SDK. See the SDK references for Python and Node.js for additional help.

For examples of the SDK in use, see the Python quickstart sample and the Node.js quickstart sample.

Before you begin

  1. If you haven't already, create a new Firebase project in the Firebase console. Then, open your project and do the following:

    1. On the Settings page, create a service account and download the service account key file. Keep this file safe, since it grants administrator access to your project.

    2. On the Storage page, enable Cloud Storage. Take note of your bucket name.

      You need a Cloud Storage bucket to temporarily store model files while adding them to your Firebase project. If you are on the Blaze plan, you can create and use a bucket other than the default for this purpose.

    3. On the Firebase ML page, click Get started if you haven't yet enabled Firebase ML.

  2. In the Google APIs console, open your Firebase project and enable the Firebase ML API.

  3. Install and initialize the Admin SDK.

    When you initialize the SDK, specify your service account credentials and the Cloud Storage bucket you want to use to store your models:

    Python

    import firebase_admin
    from firebase_admin import ml
    from firebase_admin import credentials
    
    firebase_admin.initialize_app(
      credentials.Certificate('/path/to/your/service_account_key.json'),
      options={
          'storageBucket': 'your-storage-bucket',
      })
    

    Node.js

    const admin = require('firebase-admin');
    const serviceAccount = require('/path/to/your/service_account_key.json');
    admin.initializeApp({
      credential: admin.credential.cert(serviceAccount),
      storageBucket: 'your-storage-bucket',
    });
    const ml = admin.machineLearning();
    

Deploy models

TensorFlow Lite files

To deploy a TensorFlow Lite model from a model file, upload it to your project and then publish it:

Python

# First, import and initialize the SDK as shown above.

# Load a tflite file and upload it to Cloud Storage
source = ml.TFLiteGCSModelSource.from_tflite_model_file('example.tflite')

# Create the model object
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(
    display_name="example_model",  # This is the name you use from your app to load the model.
    tags=["examples"],             # Optional tags for easier management.
    model_format=tflite_format)

# Add the model to your Firebase project and publish it
new_model = ml.create_model(model)
ml.publish_model(new_model.model_id)

Node.js

// First, import and initialize the SDK as shown above.

(async () => {
  // Upload the tflite file to Cloud Storage
  const storageBucket = admin.storage().bucket('your-storage-bucket');
  const files = await storageBucket.upload('./example.tflite');

  // Create the model object and add the model to your Firebase project.
  const bucket = files[0].metadata.bucket;
  const name = files[0].metadata.name;
  const gcsUri = `gs://${bucket}/${name}`;
  const model = await ml.createModel({
    displayName: 'example_model',  // This is the name you use from your app to load the model.
    tags: ['examples'],  // Optional tags for easier management.
    tfliteModel: { gcsTfliteUri: gcsUri },
  });

  // Publish the model.
  await ml.publishModel(model.modelId);

  process.exit();
})().catch(console.error);

TensorFlow and Keras models

With the Python SDK, you can convert a model from TensorFlow saved-model format to TensorFlow Lite and upload it to your Cloud Storage bucket in a single step. Then, deploy it the same way you deploy a TensorFlow Lite file.

Python

# First, import and initialize the SDK as shown above.

# Convert the model to TensorFlow Lite and upload it to Cloud Storage
source = ml.TFLiteGCSModelSource.from_saved_model('./model_directory')

# Create the model object
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(
    display_name="example_model",  # This is the name you use from your app to load the model.
    tags=["examples"],             # Optional tags for easier management.
    model_format=tflite_format)

# Add the model to your Firebase project and publish it
new_model = ml.create_model(model)
ml.publish_model(new_model.model_id)

If you have a Keras model, you can also convert it to TensorFlow Lite and upload it in a single step. You can use a Keras model that has been saved to an HDF5 file:

Python

import tensorflow as tf

# Load a Keras model, convert it to TensorFlow Lite, and upload it to Cloud Storage
model = tf.keras.models.load_model('your_model.h5')
source = ml.TFLiteGCSModelSource.from_keras_model(model)

# Create the model object, add the model to your project, and publish it. (See
# above.)
# ...

Or, you can convert and upload a Keras model straight from your training script:

Python

import tensorflow as tf

# Create a simple Keras model.
x = [-1, 0, 1, 2, 3, 4]
y = [-3, -1, 1, 3, 5, 7]

model = tf.keras.models.Sequential(
    [tf.keras.layers.Dense(units=1, input_shape=[1])])
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(x, y, epochs=3)

# Convert the model to TensorFlow Lite and upload it to Cloud Storage
source = ml.TFLiteGCSModelSource.from_keras_model(model)

# Create the model object, add the model to your project, and publish it. (See
# above.)
# ...

AutoML TensorFlow Lite models

If you trained an Edge model with the AutoML Cloud API or with the Google Cloud console UI, you can deploy the model to Firebase using the Admin SDK.

You need to specify the model's resource identifier, which is a string that looks like the following example:

projects/PROJECT_NUMBER/locations/STORAGE_LOCATION/models/MODEL_ID

PROJECT_NUMBER: The project number of the Cloud Storage bucket that contains the model. This might be your Firebase project or another Google Cloud project. You can find this value on the Settings page of the Firebase console or the Google Cloud Console dashboard.
STORAGE_LOCATION: The resource location of the Cloud Storage bucket that contains the model. This value is always us-central1.
MODEL_ID: The model's ID, which you got from the AutoML Cloud API.

Python

# First, import and initialize the SDK as shown above.

# Get a reference to the AutoML model
source = ml.TFLiteAutoMlSource('projects/{}/locations/{}/models/{}'.format(
    # See above for information on these values.
    project_number,
    storage_location,
    model_id
))

# Create the model object
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(
    display_name="example_model",  # This is the name you will use from your app to load the model.
    tags=["examples"],             # Optional tags for easier management.
    model_format=tflite_format)

# Add the model to your Firebase project and publish it
new_model = ml.create_model(model)
new_model.wait_for_unlocked()
ml.publish_model(new_model.model_id)

Node.js

// First, import and initialize the SDK as shown above.

(async () => {
  // Get a reference to the AutoML model. See above for information on these
  // values.
  const automlModel = `projects/${projectNumber}/locations/${storageLocation}/models/${modelId}`;

  // Create the model object and add the model to your Firebase project.
  const model = await ml.createModel({
    displayName: 'example_model',  // This is the name you use from your app to load the model.
    tags: ['examples'],  // Optional tags for easier management.
    tfliteModel: { automlModel: automlModel },
  });

  // Wait for the model to be ready.
  await model.waitForUnlocked();

  // Publish the model.
  await ml.publishModel(model.modelId);

  process.exit();
})().catch(console.error);

List your project's models

You can list your project's models, optionally filtering the results:

Python

# First, import and initialize the SDK as shown above.

face_detectors = ml.list_models(list_filter="tags: face_detector").iterate_all()
print("Face detection models:")
for model in face_detectors:
  print('{} (ID: {})'.format(model.display_name, model.model_id))

Node.js

// First, import and initialize the SDK as shown above.

(async () => {
  let listOptions = {filter: 'tags: face_detector'}
  let models;
  let pageToken = null;
  do {
    if (pageToken) listOptions.pageToken = pageToken;
    ({models, pageToken} = await ml.listModels(listOptions));
    for (const model of models) {
      console.log(`${model.displayName} (ID: ${model.modelId})`);
    }
  } while (pageToken != null);

  process.exit();
})().catch(console.error);

You can filter by the following fields:

Field            Examples
display_name     display_name = example_model
                 display_name != example_model

                 All display names with the experimental_ prefix:

                 display_name : experimental_*

                 Note that only prefix matching is supported.

tags             tags: face_detector
                 tags: face_detector AND tags: experimental

state.published  state.published = true
                 state.published = false

Combine filters with the AND, OR, and NOT operators and parentheses ( () ).
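Combining these operators, a single filter can, for example, select published face-detection models whose display names carry the experimental_ prefix. A minimal sketch (the tag and prefix values here are hypothetical):

```python
# Combined filter: published models tagged "face_detector" whose
# display name starts with "experimental_". The tag and prefix are
# hypothetical example values.
combined_filter = (
    "tags: face_detector AND state.published = true "
    "AND display_name : experimental_*"
)

# Pass the string to list_models() as shown above:
# models = ml.list_models(list_filter=combined_filter).iterate_all()
```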

Update models

After you add a model to your project, you can update its display name, tags, and tflite model file:

Python

# First, import and initialize the SDK as shown above.

model = ...   # Model object from create_model(), get_model(), or list_models()

# Update the model with a new tflite model. (You could also update with a
# `TFLiteAutoMlSource`)
source = ml.TFLiteGCSModelSource.from_tflite_model_file('example_v2.tflite')
model.model_format = ml.TFLiteFormat(model_source=source)

# Update the model's display name.
model.display_name = "example_model"

# Update the model's tags.
model.tags = ["examples", "new_models"]

# Add a new tag.
model.tags += ["experimental"]

# After you change the fields you want to update, save the model changes to
# Firebase and publish it.
updated_model = ml.update_model(model)
ml.publish_model(updated_model.model_id)

Node.js

// First, import and initialize the SDK as shown above.

(async () => {
  const model = ... // Model object from createModel(), getModel(), or listModels()

  // Upload a new tflite file to Cloud Storage.
  const storageBucket = admin.storage().bucket('your-storage-bucket');
  const files = await storageBucket.upload('./example_v2.tflite');
  const bucket = files[0].metadata.bucket;
  const name = files[0].metadata.name;

  // Update the model. Any fields you omit will be unchanged.
  await ml.updateModel(model.modelId, {
    displayName: 'example_model',  // Update the model's display name.
    tags: model.tags.concat(['new']),  // Add a tag.
    tfliteModel: {gcsTfliteUri: `gs://${bucket}/${name}`},
  });

  process.exit();
})().catch(console.error);

Unpublish or delete models

To unpublish or delete a model, pass the model ID to the unpublish or delete method. When you unpublish a model, it remains in your project, but is not available for your apps to download. When you delete a model, it is completely removed from your project. (Unpublishing a model isn't expected in a standard workflow, but you can use it to immediately unpublish a new model you accidentally published and that isn't being used anywhere yet, or in cases where it's worse for users to download a "bad" model than to get model-not-found errors.)
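The unpublish call itself is a one-liner. A minimal Python sketch, assuming the SDK has been imported and initialized as shown above (the model ID below is a placeholder, and the calls are left commented out because they act on a live project):

```python
# Placeholder ID; in practice, take it from create_model(), get_model(),
# or list_models().
model_id = "your-model-id"

# Unpublish: the model stays in your project but is no longer available
# for apps to download. You can publish it again later.
# ml.unpublish_model(model_id)

# Delete: the model is removed from your project entirely.
# ml.delete_model(model_id)
```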

If you no longer have a reference to the Model object, you'll probably need to get the model's ID by listing your project's models with a filter. For example, to delete all models tagged with "face_detector":

Python

# First, import and initialize the SDK as shown above.

face_detectors = ml.list_models(list_filter="tags: 'face_detector'").iterate_all()
for model in face_detectors:
  ml.delete_model(model.model_id)

Node.js

// First, import and initialize the SDK as shown above.

(async () => {
  let listOptions = {filter: 'tags: face_detector'}
  let models;
  let pageToken = null;
  do {
    if (pageToken) listOptions.pageToken = pageToken;
    ({models, pageToken} = await ml.listModels(listOptions));
    for (const model of models) {
      await ml.deleteModel(model.modelId);
    }
  } while (pageToken != null);

  process.exit();
})().catch(console.error);