Last updated 2025-09-02 UTC.

When you're ready to launch your app and have real end users interact with your generative AI features, make sure to review this checklist of best practices and important considerations.

| You can complete many of these checklist items as soon as you start to seriously develop your app and well before launch.
| **Most importantly, you should enable [Firebase App Check](/docs/ai-logic/app-check) to help secure your app and configure [Firebase Remote Config](/docs/ai-logic/solutions/remote-config) to allow on-demand changes to AI parameters (like model name) without an app update.**

## General

### Review the general launch checklist for apps that use Firebase

This [Firebase launch checklist](/support/guides/launch-checklist) describes important best practices before launching any Firebase app to production.

### Make sure your Firebase projects follow best practices

For example, make sure that you use different Firebase projects for development, testing, and production.
Review more best practices for [managing your projects](/support/guides/launch-checklist#projects-follow-best-practices).

## Access and security

### Review the general security checklist for apps that use Firebase

This [security checklist](/support/guides/security-checklist) describes important best practices for access and security for Firebase apps and services.

### Start *enforcing* Firebase App Check

[Firebase App Check](/docs/ai-logic/app-check) helps protect the APIs that give you access to the Gemini and Imagen models. App Check verifies that requests come from your actual app and an authentic, untampered device. It supports attestation providers for Apple platforms (DeviceCheck or App Attest), Android (Play Integrity), and Web (reCAPTCHA Enterprise), and it supports all these providers for Flutter and Unity apps as well.

Also, to [prepare for upcoming enhanced protection from App Check](/docs/ai-logic/app-check#enhanced-protection) through *replay protection*, we recommend enabling limited-use tokens in your apps.

### Set up restrictions for your Firebase API keys

- Review each Firebase API key's ["API restrictions"](https://cloud.google.com/docs/authentication/api-keys#adding_api_restrictions) allowlist:

  - Make sure that the Firebase AI Logic API is in the allowlist.

  - Make sure that the only other APIs in the key's allowlist are for Firebase services that you use in your app. See the [list of which APIs are required to be on the allowlist for each product](/docs/projects/api-keys#faq-required-apis-for-restricted-firebase-api-key).

- Set ["Application restrictions"](https://cloud.google.com/docs/authentication/api-keys#adding_application_restrictions) to help restrict usage of each Firebase API key to only requests from your app (for example, a matching bundle ID for the Apple app).
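To make the allowlist review above concrete, here is a minimal plain-JavaScript sketch of such an audit. It does not use any Firebase or Google Cloud SDK, the function name is hypothetical, and the `firebasevertexai.googleapis.com` service name is an assumption — verify the exact required API names for your project against the linked list:

```javascript
// Illustrative audit of an API key's "API restrictions" allowlist.
// Assumed service name for the Firebase AI Logic API; verify for your project.
const REQUIRED_API = "firebasevertexai.googleapis.com";

function auditAllowlist(allowlist, usedFirebaseApis) {
  // The required API must be present for Firebase AI Logic calls to work.
  const missingRequired = !allowlist.includes(REQUIRED_API);
  // Any allowlisted API your app does not actually use is excess attack surface.
  const unexpected = allowlist.filter(
    (api) => api !== REQUIRED_API && !usedFirebaseApis.includes(api)
  );
  return { missingRequired, unexpected };
}
```

You might run a check like this against the allowlists you manage in the Google Cloud console to catch keys that are either missing the required API or over-permissive.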
Note that even if you restrict your key, Firebase App Check is still strongly recommended.

Note that Firebase-related APIs use API keys only to *identify* the Firebase project or app, *not for authorization* to call the API.

## Billing, monitoring, and quota

### Avoid surprise bills

If your Firebase project is on the pay-as-you-go Blaze pricing plan, then [monitor your usage](/docs/ai-logic/monitoring) and [set up budget alerts](/docs/projects/billing/avoid-surprise-bills#set-up-budget-alert-emails).

### Set up AI monitoring in the Firebase console

[Set up AI monitoring](/docs/ai-logic/monitoring#ai-monitoring-in-console) to gain visibility into key performance metrics, like requests, latency, errors, and token usage. AI monitoring also helps you inspect and debug your Firebase AI Logic features by surfacing individual traces.

### Review your quotas for the required underlying APIs

- Make sure that you [understand the quotas for each required API](/docs/ai-logic/quotas#understand-quotas).

- [Set rate limits per user](/docs/ai-logic/quotas#understand-quotas-vertexai-in-firebase) (the default is 100 RPM).

- [Edit quota or request a quota increase](/docs/ai-logic/quotas#edit-quota-or-request-quota-increase), as needed.

## Management of configurations

### Use a stable model version in your production app

In your production app, only use [*stable* model versions](/docs/ai-logic/models#versions) (like `gemini-2.0-flash-001`), not a *preview* or *experimental* version or an *auto-updated* alias.

Even though an *auto-updated* stable alias points to a stable version, the actual model version it points to will automatically change whenever a new stable version is released, which could mean unexpected behavior or responses. Also, *preview* and *experimental* versions are only recommended during prototyping.

| **Important:** We strongly recommend using [Firebase Remote Config](/docs/ai-logic/solutions/remote-config) to control and
| update the model name used in your app (see the next section).

### Set up and use Firebase Remote Config

With [Remote Config](/docs/ai-logic/solutions/remote-config), you can control important configurations for your generative AI feature *in the cloud* rather than hard-coding values in your code. This means that you can update your configuration without releasing a new version of your app. You can do a lot with Remote Config, but here are the top values that we recommend you control remotely for your generative AI feature:

- Keep your app up-to-date.

  - **Model name**: Update the model your app uses as new models are released or others are discontinued.

- Adjust values and inputs based on client attributes, or to accommodate feedback from testing or users.

  - **Model configuration**: Adjust the temperature, max output tokens, and more.

  - **Safety settings**: Adjust safety settings if too many responses are getting blocked or if users report harmful responses.

  - **System instructions** and **any prompts that you provide**: Adjust the additional context that you're sending to the model to steer its responses and behavior.
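As a sketch of how these remotely controlled values might be consumed client-side: in a real app the values would come from the Firebase Remote Config SDK (a fetch-and-activate call followed by per-key reads), but here a plain object stands in so the fallback logic is self-contained. All parameter names, defaults, and the helper name are illustrative, not part of any Firebase API:

```javascript
// In-app defaults used whenever a Remote Config value is absent or empty.
const DEFAULTS = {
  model_name: "gemini-2.0-flash-001", // stable version, not a preview or auto-updated alias
  temperature: 0.7,
  max_output_tokens: 1024,
  system_instructions: "You are a helpful in-app assistant.",
};

// `remote` is a plain key/value object standing in for fetched Remote Config values.
function resolveAiSettings(remote) {
  // A remote value wins only when it is present and non-empty.
  const pick = (key) =>
    remote[key] !== undefined && remote[key] !== "" ? remote[key] : DEFAULTS[key];
  return {
    modelName: pick("model_name"),
    generationConfig: {
      temperature: Number(pick("temperature")),
      maxOutputTokens: Number(pick("max_output_tokens")),
    },
    systemInstruction: pick("system_instructions"),
  };
}
```

With a shape like this, shipping a new model name or tightening a safety-related prompt becomes a Remote Config publish rather than an app release.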
With system instructions and prompts, for example, you might tailor prompts for specific client types, or personalize prompts for new users so that they differ from the prompts used to generate responses for existing users.

You could also optionally set a `minimum_version` parameter in Remote Config to compare the app's current version with the Remote Config-defined latest version, to either show an upgrade notification to users or force users to upgrade.

### Set the location for accessing the model

| *Only available when using the Vertex AI Gemini API as your API provider.*

[Setting a location for accessing the model](/docs/ai-logic/locations) can help with costs as well as help reduce latency for your users.

If you don't specify a location, the default is `us-central1`. You can set this location during initialization, or you can optionally [use Firebase Remote Config to dynamically change the location based on each user's location](/docs/ai-logic/solutions/remote-config).
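The `minimum_version` comparison mentioned above can be sketched as a small client-side version check. This is a hypothetical helper, not a Firebase API, and it assumes dotted numeric version strings (e.g. `"1.2.3"`):

```javascript
// Returns true when the installed app version is older than the
// `minimum_version` value served from Remote Config, i.e. when the user
// should be shown an upgrade notification or forced to upgrade.
function needsUpgrade(currentVersion, minimumVersion) {
  const cur = currentVersion.split(".").map(Number);
  const min = minimumVersion.split(".").map(Number);
  const len = Math.max(cur.length, min.length);
  for (let i = 0; i < len; i++) {
    const a = cur[i] ?? 0; // missing segments compare as 0, so "1.2" == "1.2.0"
    const b = min[i] ?? 0;
    if (a !== b) return a < b;
  }
  return false; // versions are equal: no upgrade needed
}
```

Fetching `minimum_version` alongside the AI parameters means one Remote Config publish can both retire an old model name and nudge stragglers onto an app version that supports its replacement.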