An Azure Data Factory pipeline needs to be scheduled so that it executes when a file is deleted in an Azure Data Lake Storage Gen2 container.
Which kind of trigger would you prefer?
A. Schedule trigger
B. On-demand trigger
C. Event trigger
D. Tumbling window trigger

Correct Answer: C
Event-driven architecture is a common data integration pattern that involves the production, detection, consumption of, and reaction to events.
Data integration scenarios often require Data Factory users to trigger pipelines based on events occurring in a storage account, such as the deletion or arrival of a file in a Blob Storage account.

Option A is incorrect.
A schedule trigger is used to run a pipeline periodically (e.g., hourly or daily).
Option B is incorrect.
In the given scenario, an Event trigger, not an on-demand trigger, should be used.
Option C is correct.
An event trigger should be used so that the pipeline executes when the file is deleted.
Option D is incorrect.
A tumbling window trigger is a type of trigger that fires at a periodic time interval from a specified start time while retaining state.
Reference: see the Azure Data Factory documentation on creating event-based (storage event) triggers.
The best trigger to use in this scenario would be an Event Trigger (Option C).
An Event Trigger can be set up to start a pipeline when a specific event occurs, such as the deletion of a file in an Azure Data Lake Storage Gen2 container. When the event occurs, the trigger starts the pipeline and executes the activities within it.
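As an illustration, the following is a minimal sketch of how such a trigger could be created with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, storage account, container, trigger, and pipeline names are placeholders rather than values from the question, and exact model or method names may vary slightly between SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

# Placeholder identifiers; replace with real values for your environment.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Storage event trigger that fires whenever a blob is deleted in the
# chosen ADLS Gen2 container and then runs the referenced pipeline.
trigger = BlobEventsTrigger(
    scope=(
        "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
        "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    ),
    events=["Microsoft.Storage.BlobDeleted"],
    blob_path_begins_with="/<container-name>/blobs/",  # limit to one container
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="<pipeline-name>")
        )
    ],
)

client.triggers.create_or_update(
    resource_group,
    factory_name,
    "DeleteFileEventTrigger",
    TriggerResource(properties=trigger),
)

# The trigger must be started before it begins reacting to storage events.
client.triggers.begin_start(
    resource_group, factory_name, "DeleteFileEventTrigger"
).result()
```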
In contrast, a Schedule Trigger (Option A) is used to run a pipeline on a predefined schedule, such as every day at a specific time. It does not react to external events, such as the deletion of a file.
An On-demand Trigger (Option B) is used to manually start a pipeline when needed, rather than on a schedule or based on an event. This trigger is not suitable for this scenario, as it requires manual intervention.
A Tumbling Window Trigger (Option D) fires at fixed, contiguous time intervals from a specified start time and retains state between runs. It is time-driven rather than event-driven, so it cannot react to the deletion of a file and is not suitable for this scenario.
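For contrast, here is a hedged sketch of how the two time-based triggers are defined with the same SDK; the start times and pipeline name are again hypothetical. Neither definition references a storage account or an event type, which is why they fire on a clock rather than in response to a file deletion.

```python
from datetime import datetime, timezone

from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TumblingWindowTrigger,
)

pipeline_ref = TriggerPipelineReference(
    pipeline_reference=PipelineReference(reference_name="<pipeline-name>")
)

# Schedule trigger: runs the pipeline on a wall-clock recurrence (here, daily),
# regardless of anything happening in the storage account.
schedule_trigger = ScheduleTrigger(
    recurrence=ScheduleTriggerRecurrence(
        frequency="Day",
        interval=1,
        start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
        time_zone="UTC",
    ),
    pipelines=[pipeline_ref],
)

# Tumbling window trigger: fires at fixed, contiguous intervals from a start
# time and keeps state per window; still time-driven, not event-driven.
tumbling_trigger = TumblingWindowTrigger(
    pipeline=pipeline_ref,
    frequency="Hour",
    interval=1,
    start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
    max_concurrency=1,
)
```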
Therefore, an Event Trigger is the best option for making an Azure Data Factory pipeline execute when a file is deleted in an Azure Data Lake Storage Gen2 container.