Azure Data Factory Pipeline Execution Logging Levels

Data Flow Activities Logging Levels - DP-203 Microsoft Exam

Question

Jamie is building an ETL data pipeline using Azure Data Factory, where he manages mapping data flow jobs with Synapse data flows and a Databricks notebook with a Lookup activity.

He's required to configure the logging level for the pipeline execution of the data flow activities to collect telemetry logs.

Which three of the following logging level options are available for Azure Data Factory pipeline execution of data flow activities?

Answers

Explanations



Correct Answers: B, D and E

Verbose

Basic

None


When you create and run an ETL data pipeline using Azure Data Factory, you can configure the logging level for pipeline execution of the data flow activities to collect telemetry logs. The logging level determines the amount of information that is collected in the logs.

There are three logging level options available for Azure Data Factory pipeline execution of the data flow activities:

  1. Verbose (default): This logging level requests the service to fully log activity at each individual partition level during the data transformation. It collects the most telemetry, including per-partition execution details and timings, along with the status of each activity and any error messages that are generated.

  2. Basic: This logging level logs only the transformation durations, without the per-partition detail collected at the Verbose level. It is a lighter-weight option when you do not need full telemetry for every run.

  3. None: This logging level provides only a summary of durations and collects the least telemetry for the data flow activity execution.
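
For orientation, the sketch below shows where this setting lives in the definition of an Execute Data Flow activity. It is a minimal sketch, not a complete pipeline definition: the activity and data flow names are hypothetical, and the mapping of the authoring-UI labels (Verbose, Basic, None) to the underlying traceLevel JSON values ("Fine", "Coarse", "None") is noted as an assumption in the comments.

    import json

    # Minimal sketch (not a complete pipeline definition) of an Execute Data Flow
    # activity, showing where the logging level appears in the activity JSON.
    # "RunMappingDataFlow" and "JamieDataFlow" are hypothetical names.
    # Assumption: the authoring-UI labels Verbose/Basic/None correspond to the
    # JSON traceLevel values "Fine"/"Coarse"/"None".
    execute_data_flow_activity = {
        "name": "RunMappingDataFlow",
        "type": "ExecuteDataFlow",
        "typeProperties": {
            "dataflow": {
                "referenceName": "JamieDataFlow",
                "type": "DataFlowReference"
            },
            "compute": {
                "coreCount": 8,
                "computeType": "General"
            },
            # "Fine" ~ Verbose (default), "Coarse" ~ Basic, "None" ~ None
            "traceLevel": "Coarse"
        }
    }

    print(json.dumps(execute_data_flow_activity, indent=2))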

Verbose is the default logging level. If you do not require every pipeline execution of your data flow activities to fully log all verbose telemetry, you can set the logging level to Basic or None to reduce the amount of data that is collected.
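
If you manage pipelines as code, the logging level can also be set programmatically. The sketch below uses the azure-mgmt-datafactory Python SDK; all resource names are placeholders, and the model and parameter names used here (ExecuteDataFlowActivity, DataFlowReference, trace_level) should be treated as assumptions to verify against the SDK version you have installed.

    # Sketch only: placeholder resource names; verify the model and parameter
    # names (ExecuteDataFlowActivity, DataFlowReference, trace_level) against
    # your installed azure-mgmt-datafactory version.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        DataFlowReference,
        ExecuteDataFlowActivity,
        PipelineResource,
    )

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Execute Data Flow activity with a reduced logging level ("Coarse" ~ Basic).
    activity = ExecuteDataFlowActivity(
        name="RunMappingDataFlow",
        data_flow=DataFlowReference(reference_name="JamieDataFlow"),
        trace_level="Coarse",
    )

    client.pipelines.create_or_update(
        resource_group_name="<resource-group>",
        factory_name="<data-factory-name>",
        pipeline_name="JamieEtlPipeline",
        pipeline=PipelineResource(activities=[activity]),
    )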

It's important to note that a more detailed logging level increases the telemetry collected and can affect the performance and cost of your data pipeline. Choose the level that provides the information you need for monitoring and debugging while keeping that overhead in mind.