Accessing and Analyzing Google Cloud Platform Logs | Best Practices

Obtaining Combined Logs for Multiple GCP Projects

Question

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days.

You want to be able to explore and quickly analyze the log contents.

You want to follow Google-recommended practices to obtain the combined logs for all projects.

What should you do?

Answers

A. In Stackdriver Logging, set the filter resource.labels.project_id="*" and review the results.

B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.

C. Create a Stackdriver Logging Export with a Sink destination to a Cloud Storage bucket. Create a lifecycle rule to delete objects after 60 days.

D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.

Explanations

Correct answer: B.

https://cloud.google.com/blog/products/gcp/best-practices-for-working-with-google-cloud-audit-logging

The correct answer for this question is B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.

Explanation: To access the combined logs of several Google Cloud Platform (GCP) projects for the past 60 days, the recommended starting point is Stackdriver Logging (now Cloud Logging), a fully managed logging service that collects logs from GCP resources, third-party services, and applications.

Option A is not the best solution because the Stackdriver Logging viewer only shows logs for the currently selected project; a filter such as resource.labels.project_id="*" does not pull in logs from other projects. In addition, many log types are retained in Logging for only 30 days by default, so the full 60-day window may not even be available to browse.

Option C is not the best solution because, while a Cloud Storage sink with a lifecycle rule that deletes objects after 60 days does keep the required amount of history, the exported logs sit in the bucket as files. They cannot be queried in place, so exploring and quickly analyzing the log contents would require downloading the files or loading them into another tool first, which fails the analysis requirement.
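For contrast, here is a minimal sketch of what the option C lifecycle rule could look like with the google-cloud-storage Python client. The project ID and bucket name are placeholder assumptions, and it presumes a logging sink already writes to the bucket:

from google.cloud import storage

# Assumption: a Logging export sink already routes log entries to this bucket.
client = storage.Client(project="my-project")   # placeholder project ID
bucket = client.get_bucket("my-log-archive")    # placeholder bucket name
bucket.add_lifecycle_delete_rule(age=60)        # delete objects older than 60 days
bucket.patch()                                  # persist the lifecycle configuration

Even with this rule in place, the retained objects are raw files rather than queryable tables, which is why option C falls short of the analysis requirement.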

Option D is not the best solution because a Cloud Scheduler job that reads from Stackdriver and writes the logs into BigQuery is a custom pipeline: you would have to write, schedule, and maintain the export code yourself and deal with failures and gaps, even though Logging can already stream entries to BigQuery natively through an export sink. The 60-day table expiration itself is fine; the hand-rolled export mechanism is what makes this option contrary to Google-recommended practices.

Option B is the best solution. Creating a Stackdriver Logging Export with a Sink destination to a BigQuery dataset streams log entries into BigQuery, a fully managed, serverless data warehouse that supports fast SQL queries, so you can explore and quickly analyze the log contents. Configuring the table expiration to 60 days keeps exactly the required window of history without manual cleanup. To combine the logs of all projects, you can create the same sink in each project pointing at one shared dataset, or use an aggregated export at the folder or organization level, which is in line with Google-recommended practices for obtaining combined logs for all projects.
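Below is a minimal sketch of option B for a single project using the google-cloud-bigquery and google-cloud-logging Python clients (the Cloud Console or gcloud work equally well). The project ID, dataset name, and sink name are placeholder assumptions; for several projects you would repeat the sink per project, or create one aggregated sink at the folder or organization level, all pointing at the same dataset.

from google.cloud import bigquery
from google.cloud import logging as cloud_logging

PROJECT_ID = "my-project"      # placeholder: replace with your project ID
DATASET_ID = "combined_logs"   # placeholder: any dataset name works

# 1. Create a dataset whose tables expire 60 days after creation.
bq_client = bigquery.Client(project=PROJECT_ID)
dataset = bigquery.Dataset(f"{PROJECT_ID}.{DATASET_ID}")
dataset.default_table_expiration_ms = 60 * 24 * 60 * 60 * 1000  # 60 days in ms
bq_client.create_dataset(dataset, exists_ok=True)

# 2. Create a sink that routes this project's log entries into the dataset.
#    No filter is set, so all log entries are exported.
log_client = cloud_logging.Client(project=PROJECT_ID)
sink = log_client.sink(
    "all-logs-to-bq",          # placeholder sink name
    destination=f"bigquery.googleapis.com/projects/{PROJECT_ID}/datasets/{DATASET_ID}",
)
sink.create(unique_writer_identity=True)

# The sink's service account must be granted BigQuery Data Editor on the
# dataset before entries start flowing.
print("Grant this identity access to the dataset:", sink.writer_identity)

Once entries arrive, the tables in the dataset can be queried with standard SQL in BigQuery, including joins and aggregations across every project that exports to it.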