Migrating Data Backup and Disaster Recovery Solutions to GCP

Question

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP).

The first step the organization wants to take is to migrate its ongoing data backup and disaster recovery solutions to GCP.

The organization's on-premises production environment is going to be the next phase for migration to GCP.

Stable networking connectivity between the on-premises environment and GCP is also being implemented.

Which GCP solution should the organization use?

Answers

A. BigQuery using a data pipeline job with continuous updates via Cloud VPN
B. Cloud Storage using a scheduled task and gsutil via Cloud Interconnect
C. Compute Engine virtual machines using Persistent Disk via Cloud Interconnect
D. Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN

Correct answer: B.

Explanations

https://cloud.google.com/solutions/migration-to-google-cloud-building-your-foundation

Option A: BigQuery using a data pipeline job with continuous updates via Cloud VPN

BigQuery is GCP's data warehousing and analytics service. This option moves data from the on-premises environment to GCP through a data pipeline job with continuous updates, with a Cloud VPN tunnel securing the transfer. However, BigQuery is designed for analytics workloads, not for storing backups or disaster recovery data, so it is not a good fit for the organization's needs.

Option B: Cloud Storage using a scheduled task and gsutil via Cloud Interconnect

Cloud Storage is GCP's scalable, durable object storage service. This option stores backup and disaster recovery data in Cloud Storage, using a scheduled task that runs gsutil to copy data from the on-premises environment over Cloud Interconnect. Cloud Interconnect provides dedicated, reliable connectivity between the on-premises environment and GCP. This is a good fit for the organization's needs: a scalable, durable storage target for backups, with reliable high-bandwidth data transfer.
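A minimal sketch of the scheduled-task approach in Option B, assuming a Linux backup host with gsutil installed and authenticated; the bucket name and backup directory below are placeholders, not values from the question:

```shell
#!/bin/bash
# nightly-backup.sh -- sync the local backup directory to Cloud Storage.
# Assumes gsutil is authenticated (e.g. via a service account) and that
# traffic to GCP routes over the Cloud Interconnect link.

set -euo pipefail

SRC_DIR="/var/backups"                 # local backup staging area (placeholder)
DST_BUCKET="gs://example-dr-backups"   # hypothetical bucket name

# -m: parallel transfers; rsync -r: recurse into subdirectories.
# The -d flag (delete remote objects missing locally) is deliberately
# omitted so earlier backups are never removed from the bucket.
gsutil -m rsync -r "${SRC_DIR}" "${DST_BUCKET}"
```

The "scheduled task" part would typically be a cron entry on the backup host, for example `0 2 * * * /usr/local/bin/nightly-backup.sh` to run the sync every night at 02:00.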

Option C: Compute Engine virtual machines using Persistent Disk via Cloud Interconnect

Compute Engine is GCP's virtual machine (VM) service, and Persistent Disk provides durable, high-performance block storage for those VMs. This option would move on-premises servers onto Compute Engine VMs with Persistent Disk, connected over Cloud Interconnect. That approach suits the later phase of migrating the production environment, but it is not a backup and disaster recovery solution, which is the first step the organization wants to take.
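For illustration of what Option C would involve, here is a hedged sketch of provisioning a Compute Engine VM with an attached persistent disk using the gcloud CLI; the instance name, disk name, zone, and machine type are all placeholders chosen for this example:

```shell
# Sketch: create a persistent disk and attach it to a new Compute Engine VM.
# Assumes gcloud is installed, authenticated, and pointed at a project.

# Create a 500 GB SSD persistent disk (name and zone are placeholders).
gcloud compute disks create prod-data-disk \
    --size=500GB --type=pd-ssd --zone=us-central1-a

# Create a VM and attach the disk in read-write mode.
gcloud compute instances create prod-vm-1 \
    --zone=us-central1-a \
    --machine-type=e2-standard-4 \
    --image-family=debian-12 --image-project=debian-cloud \
    --disk=name=prod-data-disk,mode=rw
```

This kind of provisioning addresses lift-and-shift of servers, not the backup and disaster recovery requirement in the question, which is why Option C is not the answer.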

Option D: Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN

Cloud Datastore is GCP's NoSQL document database service. This option would move backup and disaster recovery data into Datastore through regularly scheduled batch upload jobs over Cloud VPN. However, Datastore is designed for applications that need low-latency access to structured data, not for bulk backup storage, so it is not a good fit for the organization's needs.

Overall, the best solution for the organization is Option B: Cloud Storage using a scheduled task and gsutil via Cloud Interconnect. It provides scalable, durable object storage for backup and disaster recovery data, with reliable transfer over dedicated connectivity.