Add Recommendations to your app with TensorFlow Lite and Firebase - iOS Codelab

Welcome to the Recommendations with TensorFlow Lite and Firebase codelab. In this codelab you'll learn how to use TensorFlow Lite and Firebase to deploy a recommendation model to your app. This codelab is based on this TensorFlow Lite example.

Recommendations allow apps to use machine learning to intelligently serve the most relevant content to each user. They take past user behavior into account to suggest app content the user might like to interact with in the future, using a model trained on the aggregate behavior of a large number of other users.

This tutorial shows how to obtain data from your app's users with Firebase Analytics, build a machine learning model for recommendations from that data, and then use that model in an iOS app to run inference and obtain recommendations. In particular, our recommendations will suggest which movies a user would most likely watch given the list of movies the user has liked previously.

What you'll learn

  • Integrate Firebase Analytics into an iOS app to collect user behavior data
  • Export that data into Google BigQuery
  • Pre-process the data and train a TF Lite recommendations model
  • Deploy the TF Lite model to Firebase ML and access it from your app
  • Run on-device inference using the model to suggest recommendations to users

What you'll need

  • Xcode 11 (or higher)
  • CocoaPods 1.9.1 (or higher)


Add Firebase to the project

  1. Go to the Firebase console.
  2. Select Create New Project and name your project "Firebase ML iOS Codelab".

Download the Code

Begin by cloning the sample project and running pod install --repo-update in the project directory:

git clone https://github.com/FirebaseExtended/codelab-contentrecommendations-ios.git
cd codelab-contentrecommendations-ios/start
pod install --repo-update

If you don't have git installed, you can also download the sample project from its GitHub page or by clicking on this link. Once you've downloaded the project, run it in Xcode and play around with the recommendations feature to get a feel for how it works.

Set up Firebase

Follow the documentation to create a new Firebase project. Once you have your project, download its GoogleService-Info.plist file from the Firebase console and drag it into the root of the Xcode project.


Add Firebase to your Podfile and run pod install.

pod 'Firebase/Analytics'
pod 'Firebase/MLCommon'
pod 'Firebase/MLModelInterpreter'

In your AppDelegate, import Firebase at the top of the file:

import Firebase

Then, in the didFinishLaunchingWithOptions method, add a call to configure Firebase:

FirebaseApp.configure()
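Put together, the top of a minimal AppDelegate looks like the following sketch (your project's existing AppDelegate will contain additional code):

```swift
import UIKit
import Firebase

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
  func application(_ application: UIApplication,
                   didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Configure Firebase before making any other Firebase calls.
    FirebaseApp.configure()
    return true
  }
}
```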

Run the project again to make sure the app is configured correctly and does not crash on launch.

When creating your Firebase project:

  1. Ensure that "Enable Google Analytics for this project" is enabled.
  2. Follow the remaining setup steps in the Firebase console, then click Create project (or Add Firebase, if you're using an existing Google project).

In this step, you will add Firebase Analytics to the app to log user behavior data (in this case, which movies a user likes). This data will be used in aggregate in future steps to train the recommendations model.

Set up Firebase Analytics in the app

The LikedMoviesViewModel contains functions to store the movies the user likes. Every time the user likes a new movie, we want to also send off an analytics log event to record that like.

Add the code below to register an analytics event when the user clicks like on a movie.

AllMoviesCollectionViewController.swift

import Firebase
//


override func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    // ...
    if movie.liked == nil {
      movie.liked = true
      Analytics.logEvent(AnalyticsEventSelectItem,
                         parameters: [AnalyticsParameterItemID: movie.id])
    } else {
      movie.liked?.toggle()
    }
}

In this step, we will generate Analytics events in the app and verify that they are being sent to the Firebase Console.

Enable Analytics Debug Logging

Generally, events logged by your app are batched together over the period of approximately one hour and uploaded together. This approach conserves the battery on end users' devices and reduces network data usage. However, for the purposes of validating your analytics implementation (and, in order to view your analytics in the DebugView report), you can enable Debug mode on your development device to upload events with a minimal delay.

To enable Analytics Debug mode on your development device, specify the following command-line argument in your Xcode scheme (Product > Scheme > Edit Scheme > Run > Arguments Passed On Launch):

-FIRDebugEnabled

At this point, you have successfully integrated Firebase Analytics into your app. As users use your app and like movies, their likes will be logged in aggregate. We will use this aggregate data in the rest of this codelab to train our recommendations model. The following is an optional step to see the same Analytics events you saw in the Xcode console also stream into the Firebase console. Feel free to skip to the next page.

Optional: Confirm Analytics events in Firebase Console

  1. Go to the Firebase console.
  2. Select DebugView under Analytics
  3. In Xcode, select Run to launch the app and add some movies to your Liked list.
  4. In the Firebase console's DebugView, verify that these events are being logged as you add movies in the app.

BigQuery is a Google Cloud product that allows you to examine and process large amounts of data. In this step, you will connect your Firebase project to BigQuery so that the Analytics data generated by your app is automatically exported to BigQuery.

Enable BigQuery export

  1. Go to the Firebase console.
  2. Select the Settings gear icon next to Project Overview, and then select Project settings
  3. Select the Integrations tab.
  4. Select Link (or Manage) inside the BigQuery block.
  5. Select Next in the About Linking Firebase to BigQuery step.
  6. Under the Configure integration section, click the switch to enable sending Google Analytics data and select Link to BigQuery.

You have now enabled your Firebase project to automatically send Firebase Analytics event data to BigQuery. This happens without any further interaction; however, the first export that creates the analytics dataset in BigQuery may not happen for 24 hours. After the dataset is created, Firebase continually exports new Analytics events to BigQuery into the intraday table and groups events from past days in the events table.

Training a recommendations model requires a lot of data. Since we don't already have an app generating large amounts of data, in the next step we will import a sample dataset into BigQuery to use for the rest of this tutorial.

Now that we have connected our Firebase Console to export to BigQuery, our app analytics event data will automatically show up in the BigQuery console after some time. To get some initial data for the purposes of this tutorial, in this step we will import an existing sample dataset into your BigQuery console to use to train our recommendations model.

Import sample dataset into BigQuery

  1. Go to the BigQuery dashboard in the Google Cloud console.
  2. Select your project name in the menu.
  3. Select your project name at the bottom of the BigQuery left navigation to see details.
  4. Select Create dataset to open the dataset creation panel.
  5. Enter 'firebase_recommendations_dataset' for the Dataset ID and select Create dataset.
  6. The new dataset will show up in the left menu under the project name. Click it.
  7. Select Create table to open the table creation panel.
  8. For Create table from, select 'Google Cloud Storage'.
  9. In the Select file from GCS bucket field, enter 'gs://firebase-recommendations/recommendations-test/formatted_data_filtered.txt'.
  10. Select 'JSONL' in the File format dropdown.
  11. Enter 'recommendations_table' for the Table name.
  12. Check the box under Schema > Auto detect > Schema and input parameters.
  13. Select Create table.

Explore sample dataset

At this point, you can optionally explore the schema and preview this dataset.

  1. Select firebase_recommendations_dataset in the left menu to expand the tables it contains.
  2. Select the recommendations_table table to view the table schema.
  3. Select Preview to see the actual Analytics event data this table contains.

Create service account credentials

Now, we will create service account credentials in our Google Cloud console project that we can use in the Colab environment in the following step to access and load our BigQuery data.

  1. Make sure that billing is enabled for your Google Cloud project.
  2. Enable the BigQuery and BigQuery Storage APIs.
  3. Go to the Create Service Account Key page.
  4. From the Service account list, select New service account.
  5. In the Service account name field, enter a name.
  6. From the Role list, select Project > Owner.
  7. Click Create. A JSON file that contains your key downloads to your computer.

In the next step, we will use Google Colab to preprocess this data and train our recommendations model.

In this step, we will use a Colab notebook to perform the following steps:

  1. import the BigQuery data into the Colab notebook
  2. preprocess the data to prepare it for model training
  3. train the recommendations model on the analytics data
  4. export the model as a TensorFlow Lite model
  5. deploy the model to the Firebase Console so we can use it in our app

Before we launch the Colab training notebook, we will first enable the Firebase Model Management API so Colab can deploy the trained model to our Firebase console.

Enable Firebase Model Management API

Create a bucket to store your ML models

In your Firebase Console, go to Storage and click Get started.

Follow the dialog to set up your bucket.


Enable Firebase ML API

Go to the Firebase ML API page in the Google Cloud Console and click Enable.

Use Colab notebook to train and deploy the model

Open the Colab notebook using the following link and complete the steps within. After finishing the steps in the Colab notebook, you will have a TensorFlow Lite model file deployed to the Firebase console that we can sync down to our app.

Open in Colab

In this step, we'll modify our app to download the model we just trained from Firebase Machine Learning.

Add Firebase ML dependency

The following dependencies are needed in order to use Firebase Machine Learning models in your app. They should already be in your Podfile (verify):

Podfile

pod 'Firebase/MLCommon'
pod 'Firebase/MLModelInterpreter'

Download the model with Firebase Model Manager API

Copy the code below into ModelDownloader.swift to set up the conditions under which model download occurs and create a download task to sync the remote model to our app.

ModelDownloader.swift

  static func downloadModel(named name: String,
                            completion: @escaping (RemoteModel?, DownloadError?) -> Void) {
    guard FirebaseApp.app() != nil else {
      completion(nil, .firebaseNotInitialized)
      return
    }
    guard success == nil && failure == nil else {
      completion(nil, .downloadInProgress)
      return
    }

    let remoteModel = CustomRemoteModel(name: name)
    let conditions = ModelDownloadConditions(allowsCellularAccess: true,
                                             allowsBackgroundDownloading: true)

    success = NotificationCenter.default.addObserver(forName: .firebaseMLModelDownloadDidSucceed,
                                                     object: nil,
                                                     queue: nil) { (notification) in
      defer { success = nil; failure = nil }
      guard let userInfo = notification.userInfo,
          let model = userInfo[ModelDownloadUserInfoKey.remoteModel.rawValue] as? RemoteModel
      else {
        completion(nil, .downloadReturnedEmptyModel)
        return
      }
      guard model.name == name else {
        completion(nil, .downloadReturnedWrongModel)
        return
      }
      completion(model, nil)
    }
    failure = NotificationCenter.default.addObserver(forName: .firebaseMLModelDownloadDidFail,
                                                     object: nil,
                                                     queue: nil) { (notification) in
      defer { success = nil; failure = nil }
      guard let userInfo = notification.userInfo,
          let error = userInfo[ModelDownloadUserInfoKey.error.rawValue] as? Error
      else {
        completion(nil, .mlkitError(underlyingError: DownloadError.unknownError))
        return
      }
      completion(nil, .mlkitError(underlyingError: error))
    }
    ModelManager.modelManager().download(remoteModel, conditions: conditions)
  }
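A call site for the helper above might look like the following sketch; the model name must match the name the Colab notebook used when deploying to Firebase ML (here assumed to be "recommendations", the same name used later in loadModel):

```swift
// Hypothetical call site for ModelDownloader.downloadModel.
ModelDownloader.downloadModel(named: "recommendations") { model, error in
  if let model = model {
    print("Downloaded remote model: \(model.name)")
  } else {
    print("Model download failed: \(String(describing: error))")
  }
}
```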

ModelDownloader.swift

  static func fetchModel(named name: String,
                         completion: @escaping (String?, DownloadError?) -> Void) {
    let remoteModel = CustomRemoteModel(name: name)
    if ModelManager.modelManager().isModelDownloaded(remoteModel) {
      ModelManager.modelManager().getLatestModelFilePath(remoteModel) { (path, error) in
        completion(path, error.map { DownloadError.mlkitError(underlyingError: $0) })
      }
    } else {
      downloadModel(named: name) { (model, error) in
        guard let model = model else {
          let underlyingError = error ?? DownloadError.unknownError
          let compositeError = DownloadError.mlkitError(underlyingError: underlyingError)
          completion(nil, compositeError)
          return
        }
        ModelManager.modelManager().getLatestModelFilePath(model) { (path, pathError) in
          completion(path, pathError.map { DownloadError.mlkitError(underlyingError: $0) })
        }
      }
    }
  }

The TensorFlow Lite runtime lets you use your model in the app to generate recommendations. In the previous step, we downloaded the model file to the device. In this step, we'll first load the dictionary and labels that accompany our model, then add pre-processing to generate the inputs to our model, run inference, and add post-processing to extract the results.

Load Dictionary and Labels

The labels the recommendations model uses to generate its candidates are listed in the file sorted_movie_vocab.json in the assets folder. Copy the following code to access these candidates.

RecommendationsViewController.swift

  func getMovies() -> [MovieItem] {
    let barController = self.tabBarController as! TabBarController
    return barController.movies
  }
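The sample project loads the candidates from sorted_movie_vocab.json elsewhere; here we simply read them from the tab bar controller. As an illustration, decoding a vocabulary file with Codable could look like the following sketch (the VocabEntry field names are assumptions; check sorted_movie_vocab.json for the actual schema):

```swift
import Foundation

// Hypothetical shape of an entry in sorted_movie_vocab.json; the real
// field names in the sample project may differ.
struct VocabEntry: Codable {
  let id: Int32
  let title: String
}

// Read and decode the whole vocabulary file into an array of entries.
func loadVocab(from url: URL) throws -> [VocabEntry] {
  let data = try Data(contentsOf: url)
  return try JSONDecoder().decode([VocabEntry].self, from: data)
}
```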

Implement Pre-processing

In the pre-processing step, we change the form of the input data to match what our model expects. Here, we pad the input length with a placeholder value if we have not generated a lot of user likes already. Copy the code below:

RecommendationsViewController.swift

  // Given a list of selected items, preprocess to get tflite input.
  func preProcess() -> Data {
    let likedMovies = getLikedMovies().map { (movie) -> Int32 in
      return movie.id
    }
    var inputData = Data(copyingBufferOf: Array(likedMovies.prefix(10)))

    // Pad input data to have a minimum of 10 context items (4 bytes each)
    while inputData.count < 10*4 {
      inputData.append(0)
    }
    return inputData
  }
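The padding logic above can be illustrated in isolation. This standalone sketch (using plain Int32 IDs in place of the app's MovieItem values) packs up to 10 IDs into Data and zero-pads the buffer to the 40 bytes the model expects:

```swift
import Foundation

// Standalone sketch of the pre-processing above: pack up to 10 Int32 movie
// IDs into Data, then zero-pad to exactly 10 * 4 = 40 bytes.
func packInput(_ likedIDs: [Int32]) -> Data {
  var data = likedIDs.prefix(10).withUnsafeBufferPointer { Data(buffer: $0) }
  while data.count < 10 * 4 {
    data.append(0)  // a single zero byte; four of them form one Int32 zero
  }
  return data
}

print(packInput([3, 18, 42]).count)  // 40
```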

Run interpreter to generate recommendations

Here we use the model we downloaded in a previous step to run inference on our pre-processed input. We set the type of input and output for our model and run inference to generate our movie recommendations. Copy the following code into your app.

RecommendationsViewController.swift

import TensorFlowLite

RecommendationsViewController.swift

 private var interpreter: Interpreter?

 func loadModel() {
    // Download the model from Firebase
    print("Fetching recommendations model...")
    ModelDownloader.fetchModel(named: "recommendations") { (filePath, error) in
      guard let path = filePath else {
        if let error = error {
          print(error)
        }
        return
      }
      print("Recommendations model download complete")
      self.loadInterpreter(path: path)
    }
  }

 func loadInterpreter(path: String) {
    do {
      interpreter = try Interpreter(modelPath: path)

      // Allocate memory for the model's input `Tensor`s.
      try interpreter?.allocateTensors()

      let inputData = preProcess()

      // Copy the input data to the input `Tensor`.
      try self.interpreter?.copy(inputData, toInputAt: 0)

      // Run inference by invoking the `Interpreter`.
      try self.interpreter?.invoke()

      // Get the output `Tensor`
      let confidenceOutputTensor = try self.interpreter?.output(at: 0)
      let idOutputTensor = try self.interpreter?.output(at: 1)

      // Copy output to `Data` to process the inference results.
      let confidenceOutputSize = confidenceOutputTensor?.shape.dimensions.reduce(1, {x, y in x * y})

      let idOutputSize = idOutputTensor?.shape.dimensions.reduce(1, {x, y in x * y})

      let confidenceResults =
        UnsafeMutableBufferPointer<Float32>.allocate(capacity: confidenceOutputSize!)
      let idResults =
        UnsafeMutableBufferPointer<Int32>.allocate(capacity: idOutputSize!)
      _ = confidenceOutputTensor?.data.copyBytes(to: confidenceResults)
      _ = idOutputTensor?.data.copyBytes(to: idResults)

      postProcess(idResults, confidenceResults)

      print("Successfully ran inference")
      DispatchQueue.main.async {
        self.tableView.reloadData()
      }
    } catch {
      print("Error occurred creating model interpreter: \(error)")
    }
  }
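The unsafe-buffer copying above can also be expressed with only Foundation. This standalone sketch (an alternative, not the sample project's code) decodes a tensor's raw little-endian bytes into a typed array, assuming the byte count is a multiple of the element size:

```swift
import Foundation

// Standalone sketch mirroring the UnsafeMutableBufferPointer copy above:
// reinterpret a tensor's raw Data as a typed Swift array.
func toArray<T>(_ data: Data, of type: T.Type) -> [T] {
  data.withUnsafeBytes { Array($0.bindMemory(to: T.self)) }
}

let bytes = Data([0, 0, 128, 63])          // Float32 1.0 in little-endian
print(toArray(bytes, of: Float32.self))    // [1.0]
```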

Implement Post-processing

Finally, in this step we post-process the output from our model, selecting the results with the highest confidence and removing movies the user has already liked. Copy the following code into your app.

RecommendationsViewController.swift

  // Postprocess to get results from tflite inference.
  func postProcess(_ idResults: UnsafeMutableBufferPointer<Int32>, _ confidenceResults: UnsafeMutableBufferPointer<Float32>) {
    for i in 0..<10 {
      let id = idResults[i]
      let movieIdx = getMovies().firstIndex { $0.id == id }
      let title = getMovies()[movieIdx!].title
      recommendations.append(Recommendation(title: title, confidence: confidenceResults[i]))
    }
  }
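If the model's outputs are not guaranteed to arrive sorted, a defensive variant of this post-processing can sort and filter explicitly. In this sketch, Rec is a stand-in for the sample project's Recommendation type, and the outputs are assumed to be parallel arrays of IDs and confidences:

```swift
// Defensive variant of the post-processing above: pair IDs with confidences,
// drop movies the user already liked, and keep the top results by confidence.
struct Rec: Equatable { let id: Int32; let confidence: Float }

func topRecommendations(ids: [Int32], confidences: [Float],
                        excluding liked: Set<Int32>, limit: Int = 10) -> [Rec] {
  return zip(ids, confidences)
    .filter { !liked.contains($0.0) }   // drop already-liked movies
    .sorted { $0.1 > $1.1 }             // highest confidence first
    .prefix(limit)
    .map { Rec(id: $0.0, confidence: $0.1) }
}

print(topRecommendations(ids: [1, 2, 3], confidences: [0.1, 0.9, 0.5],
                         excluding: [3]).map { $0.id })  // [2, 1]
```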

Test your app!

Re-run your app. Once you like a few movies, the app should automatically download the new model and start generating recommendations!

You have built a recommendations feature into your app using TensorFlow Lite and Firebase. Note that the techniques and pipeline shown in this codelab can be generalized and used to serve other types of recommendations as well.

What we've covered

  • Firebase ML
  • Firebase Analytics
  • Export analytics events to BigQuery
  • Preprocess analytics events
  • Train recommendations TensorFlow model
  • Export model and deploy to Firebase Console
  • Serve movie recommendations in an app

Next Steps

  • Implement Firebase ML recommendations in your app.

Learn More

Have a Question?

Report Issues