[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["缺少我需要的資訊","missingTheInformationINeed","thumb-down"],["過於複雜/步驟過多","tooComplicatedTooManySteps","thumb-down"],["過時","outOfDate","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["示例/程式碼問題","samplesCodeIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-09-05 (世界標準時間)。"],[],[],null,["\u003cbr /\u003e\n\n[Blaze](/pricing) plan users can set up their Firebase Realtime Database for\nautomatic backups, a self-service feature that enables daily backups of your\nDatabase application data and [rules](/database/security) in JSON format to a\n[Cloud Storage](//cloud.google.com/storage/docs/) bucket.\n\nSetup\n\nTo get started, visit the [Backups\ntab](//console.firebase.google.com/project/_/database/backups) in\nthe Database section of the Firebase console, and the wizard will guide you\nthrough setting up your automated backups.\n\nTo save on storage costs, we enable [Gzip](/docs/database/backups#gzip_compression)\ncompression by default, and you can choose to enable a\n[30-day lifecycle policy](/docs/database/backups#storage_lifecycle)\non your bucket to have backups older than 30 days automatically deleted.\n\nYou can view the status and backup activity directly in the Firebase console\nwhere you can also start a manual backup. This can be useful for taking specific\ntimed snapshots or as a safety action before you perform any code changes.\n\nOnce set up, a new Cloud Storage bucket will be created for you with the\n[WRITER permission](//cloud.google.com/storage/docs/access-control/lists#permissions)\nfor Firebase. You should not store data in this bucket you are not comfortable\nwith Firebase having access to. Firebase will have no additional access to your\nother Cloud Storage buckets or any other areas of Google Cloud.\n\nRestoring from backups\n\nTo restore your Firebase from a backup, first download the file from\nCloud Storage to your local disk. This can be done by clicking the filename\nwithin the backup activity section or from the Cloud Storage bucket\ninterface. If the file is Gzip compressed, first\n[decompress](/docs/database/backups#gzip_compression) the file.\n\nThere are two ways you can import your data:\n\nMethod 1: Click the Import JSON button in your\n[Database's Data section](//console.firebase.google.com/project/_/database/data)\nand select your application data JSON file.\n\nMethod 2: You can also issue a CURL request from your command line.\n\nFirst retrieve a secret from your Firebase, which you can get by visiting\nthe [Database settings page](//console.firebase.google.com/project/_/settings/database).\n\nThen enter the following into your terminal, replacing the `DATABASE_NAME`\nand `SECRET` fields with your own values: \n\n curl 'https://\u003cDATABASE_NAME\u003e.firebaseio.com/.json?auth=\u003cSECRET\u003e&print=silent' -X PUT -d @\u003cDATABASE_NAME\u003e.json\n\nIf you are having trouble restoring a backup from a very large database, please\nreach out to our [support team](/support/contact/troubleshooting).\n\nScheduling\n\nYour Database backup is assigned to a specific hour each day that ensures even\nload and highest availability for all backup customers. 
Scheduling

Your Database backup is assigned to a specific hour each day, which ensures
even load and the highest availability for all backup customers. This
scheduled backup occurs regardless of whether you run any manual backups
during the day.

File naming

Files transferred to your Cloud Storage bucket are timestamped
(ISO 8601 standard) and use the following naming conventions:

- Database data: `YYYY-MM-DDTHH:MM:SSZ_<DATABASE_NAME>_data.json`
- Database rules: `YYYY-MM-DDTHH:MM:SSZ_<DATABASE_NAME>_rules.json`

If [Gzip](/docs/database/backups#gzip_compression) compression is enabled, a
`.gz` suffix is appended to the filenames. You can easily find the backups
from a specific date or time using Cloud Storage prefix searching.

Gzip compression

By default, we compress your backup files using Gzip compression to save on
storage costs and decrease transfer times. The compressed file size varies
with the data characteristics of your Database, but a typical Database may
shrink to about one third of its original size, reducing storage costs and
shortening the upload time for your backups.

To decompress your Gzipped JSON files, use the `gunzip` binary, which ships by
default on macOS and most Linux distributions:

    gunzip <DATABASE_NAME>.json.gz # Unzips to <DATABASE_NAME>.json

Storage 30 day lifecycle

We offer an easy-to-use configuration switch that enables a default 30 day
object lifecycle policy for your Cloud Storage bucket. When enabled, files in
your bucket are automatically deleted after 30 days. This helps clear out old
backups you no longer need, saving on storage costs and keeping your bucket
directory clean. If you place other files in your Automated Backups bucket,
they are deleted under the same policy.

Costs

The backups feature can be enabled for projects on the [Blaze](/pricing) plan
at no additional cost. However, you are charged at the
[standard rates](//cloud.google.com/storage/pricing) for the backup files
placed in your Cloud Storage bucket. You can enable
[Gzip compression](/docs/database/backups#gzip_compression) and the
[Storage 30 day lifecycle](/docs/database/backups#storage_lifecycle) to reduce
your storage costs.
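If you'd rather manage the lifecycle policy on the bucket yourself instead of
using the console switch, the 30-day policy corresponds to a standard Cloud
Storage lifecycle rule. The following is a minimal sketch, assuming you use
the `gsutil` CLI; the `lifecycle.json` file name and `<BUCKET_NAME>` are
placeholders:

    # Create a lifecycle rule that deletes objects once they are 30 days old
    cat > lifecycle.json <<'EOF'
    {
      "rule": [
        { "action": { "type": "Delete" }, "condition": { "age": 30 } }
      ]
    }
    EOF

    # Apply the rule to your backups bucket
    gsutil lifecycle set lifecycle.json gs://<BUCKET_NAME>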