Configure Diagnostic Logs for Azure Databricks

Storing Diagnostic Logs for Databricks

Question

Phil is a Data Analyst at Adatum Corp. who is working on setting up diagnostic logs for Azure Databricks.

He is using the Azure portal to configure diagnostic settings for the Databricks workspace.

Which three of the following Azure resources can be used for storing diagnostic logs for Databricks?

Answers

A. Storage Account
B. Event Hub
C. Azure Log Analytics
D. Event Grid
E. Azure Service Bus Queue
F. Azure Data Lake Storage (ADLS) Gen2
G. Azure Cosmos DB

Correct Answers: A, C, and F.

Explanations

Phil, the Data Analyst at Adatum Corp., can use three Azure resources to store diagnostic logs for Azure Databricks:

  1. Storage Account: The most common destination for diagnostic logs. A storage account is a secure, scalable cloud storage service for large amounts of unstructured data such as text or binary data. Azure Databricks can be configured to archive its diagnostic logs to a storage account (see the configuration sketch after this list).

  2. Azure Log Analytics: A service for collecting, analyzing, and visualizing data from many sources. Azure Databricks can be configured to send diagnostic logs to a Log Analytics workspace, where the data can be queried, analyzed, and visualized in near real time with Azure Monitor (a query sketch follows at the end of this explanation).

  3. Azure Data Lake Storage (ADLS) Gen2: A scalable, secure storage service for storing and analyzing large volumes of data. An ADLS Gen2 account is a storage account with the hierarchical namespace enabled, so it is configured as a log destination in the same way. Diagnostic logs stored in ADLS Gen2 can then be processed with Azure Data Factory or other big-data processing tools.
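Although Phil is working in the Azure portal, the same configuration can be scripted. Below is a minimal sketch using the azure-mgmt-monitor Python SDK; the subscription ID, resource group, and resource names (adatum-rg, adatum-databricks, adatumlogs, adatum-logs) are hypothetical placeholders. It enables the clusters and jobs log categories and sends them to both a storage account (option A or F) and a Log Analytics workspace (option C).

```python
# Minimal sketch: configure Azure Databricks diagnostic settings in code.
# Requires the azure-identity and azure-mgmt-monitor packages. All IDs and
# names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

subscription_id = "00000000-0000-0000-0000-000000000000"  # hypothetical

# The Azure Databricks workspace whose diagnostic logs we want to capture.
databricks_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/adatum-rg"
    "/providers/Microsoft.Databricks/workspaces/adatum-databricks"
)

# Destination resources. An ADLS Gen2 account is addressed exactly like a
# regular storage account, via storage_account_id.
storage_account_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/adatum-rg"
    "/providers/Microsoft.Storage/storageAccounts/adatumlogs"
)
log_analytics_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/adatum-rg"
    "/providers/Microsoft.OperationalInsights/workspaces/adatum-logs"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# A single diagnostic setting can fan logs out to several destinations at once.
client.diagnostic_settings.create_or_update(
    resource_uri=databricks_id,
    name="databricks-diagnostics",
    parameters=DiagnosticSettingsResource(
        storage_account_id=storage_account_id,  # archive to (ADLS Gen2) storage
        workspace_id=log_analytics_id,          # send to Log Analytics
        logs=[
            LogSettings(category="clusters", enabled=True),
            LogSettings(category="jobs", enabled=True),
        ],
    ),
)
```

The same DiagnosticSettingsResource can also name an Event Hub through its event_hub_authorization_rule_id field for streaming scenarios, which helps explain why Event Hub appears among the distractors even though it is not a storage destination.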

The remaining options, Event Hub, Event Grid, Azure Service Bus Queue, and Azure Cosmos DB, are not used for storing diagnostic logs for Azure Databricks. Event Hub can receive a diagnostic log stream, but it does not store logs durably; these services are aimed at event-driven architectures, messaging, and data-processing scenarios rather than log storage.
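Once logs are flowing into Log Analytics (option C), they can also be queried from code. Below is a minimal sketch using the azure-monitor-query Python SDK; the workspace GUID is a hypothetical placeholder, and the DatabricksClusters table name assumes the clusters log category was enabled as in the configuration sketch above.

```python
# Minimal sketch: query Databricks diagnostic logs from a Log Analytics
# workspace. Requires the azure-identity and azure-monitor-query packages.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

workspace_id = "11111111-1111-1111-1111-111111111111"  # hypothetical workspace GUID

client = LogsQueryClient(DefaultAzureCredential())

# Kusto query: the ten most recent cluster-related Databricks log entries.
response = client.query_workspace(
    workspace_id=workspace_id,
    query="DatabricksClusters | sort by TimeGenerated desc | take 10",
    timespan=timedelta(days=1),  # look back over the last 24 hours
)

for table in response.tables:
    for row in table.rows:
        print(row)
```

Note that query_workspace takes the Log Analytics workspace (customer) GUID shown on the workspace overview page, not the full ARM resource ID used when creating the diagnostic setting.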