Build hybrid and on-device experiences with Firebase AI Logic
You can build AI-powered Android and web apps and features with hybrid inference
using Firebase AI Logic. Hybrid inference runs inference with on-device models
when they're available and seamlessly falls back to cloud-hosted models
otherwise (and vice versa).
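The fallback behavior described above can be sketched in a few lines. This is a minimal illustration of the dispatch pattern, not the Firebase AI Logic API itself: the `onDeviceAvailable`, `runOnDevice`, and `runInCloud` helpers are placeholders for the availability check and inference calls an SDK would provide, and the mode names are assumed for the sketch.

```typescript
// Sketch of hybrid inference dispatch: prefer the on-device model and
// fall back to the cloud-hosted one. The Backends helpers are
// placeholders for what an SDK like Firebase AI Logic provides.

type InferenceMode = "prefer_on_device" | "only_on_device" | "only_in_cloud";

interface Backends {
  onDeviceAvailable: () => boolean; // e.g. model downloaded, device capable
  runOnDevice: (prompt: string) => string;
  runInCloud: (prompt: string) => string; // requires connectivity
}

function generate(mode: InferenceMode, prompt: string, b: Backends): string {
  switch (mode) {
    case "only_on_device":
      // No fallback: fail if the local model can't serve the request.
      if (!b.onDeviceAvailable()) throw new Error("on-device model unavailable");
      return b.runOnDevice(prompt);
    case "only_in_cloud":
      return b.runInCloud(prompt);
    case "prefer_on_device":
      // Seamless fallback when the local model is unavailable.
      return b.onDeviceAvailable() ? b.runOnDevice(prompt) : b.runInCloud(prompt);
  }
}

// Example: a device without a local model falls back to the cloud.
const backends: Backends = {
  onDeviceAvailable: () => false,
  runOnDevice: (p) => `on-device: ${p}`,
  runInCloud: (p) => `cloud: ${p}`,
};
console.log(generate("prefer_on_device", "Hello", backends)); // → "cloud: Hello"
```

The same shape applies in the other direction ("prefer in-cloud with on-device fallback"): only the order of the availability check and the default branch changes.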
Using an on-device model for inference offers:
Enhanced privacy
Local context
Inference at no cost
Offline functionality
Using hybrid functionality lets you:
Reach more of your audience by accommodating varying on-device model
availability and internet connectivity
Follow our get started guides
These guides provide step-by-step instructions to set up hybrid experiences
in your apps.
Last updated 2026-02-27 UTC.