Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure subscription that contains an Azure Storage account.
You plan to implement changes to a data storage solution to meet regulatory and compliance standards.
Every day, Azure needs to identify and delete blobs that were NOT modified during the last 100 days.
Solution: You schedule an Azure Data Factory pipeline.
Does this meet the goal?
A. Yes
B. No

Correct Answer: B
Instead, you can use the Delete activity in Azure Data Factory to delete files or folders from on-premises or cloud storage stores, or apply an Azure Blob storage lifecycle policy.
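For comparison, the lifecycle-policy alternative mentioned above can express the same rule declaratively, with no pipeline to schedule. A minimal sketch of such a policy follows; the rule name is illustrative, and `daysAfterModificationGreaterThan: 100` encodes "not modified during the last 100 days":

```json
{
  "rules": [
    {
      "name": "delete-stale-blobs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 100 }
          }
        }
      }
    }
  ]
}
```

The platform evaluates lifecycle rules automatically (roughly daily), so no separate scheduler is needed.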
https://docs.microsoft.com/en-us/azure/data-factory/delete-activity
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts?tabs=azure-portal

The solution of scheduling an Azure Data Factory pipeline would meet the goal of identifying and deleting blobs that were not modified during the last 100 days.
Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows to move, transform, and process data between on-premises and cloud-based data stores. It provides a range of connectors for various data sources and destinations, including Azure Storage.
By creating a pipeline in Azure Data Factory, you can specify the source and destination data stores, and define the activities to perform on the data, such as copy, transform, and delete. In this case, you can use the Delete activity to delete blobs that meet a certain condition, such as those that were not modified during the last 100 days.
To implement this solution, you can create a pipeline that includes the following steps:
1. Use the Get Metadata activity to retrieve the metadata of the blobs in the Azure Storage account. This activity can return each blob's last-modified timestamp, which you can use to identify the blobs that were not modified during the last 100 days.
2. Use the Filter activity to select the blobs that were not modified during the last 100 days. You can use an expression that compares the last-modified timestamp with the current date and time and selects the blobs that meet the condition.
3. Use the Delete activity to delete the selected blobs from the Azure Storage account.
4. Attach a daily schedule trigger to the pipeline so the deletion runs every day.
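The filter-then-delete logic in the steps above can also be sketched in Python against the Azure Storage SDK. This is a minimal illustration, not the Data Factory implementation itself; it assumes the `azure-storage-blob` package and a storage connection string, and the function and parameter names are hypothetical:

```python
from datetime import datetime, timedelta, timezone

def is_stale(last_modified, now=None, days=100):
    """True if the blob was NOT modified during the last `days` days."""
    now = now or datetime.now(timezone.utc)
    return last_modified < now - timedelta(days=days)

def delete_stale_blobs(conn_str, container_name, days=100):
    """Delete every blob whose last-modified date is older than `days` days."""
    # The azure-storage-blob SDK is an assumed dependency:
    #   pip install azure-storage-blob
    from azure.storage.blob import ContainerClient

    client = ContainerClient.from_connection_string(conn_str, container_name)
    for blob in client.list_blobs():  # each item carries a .last_modified datetime
        if is_stale(blob.last_modified, days=days):
            client.delete_blob(blob.name)
```

A script like this would still need its own daily trigger (for example, an Azure Functions timer), which is exactly the scheduling role the Data Factory pipeline plays in the proposed solution.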
Therefore, the solution of scheduling an Azure Data Factory pipeline would meet the goal of identifying and deleting blobs that were not modified during the last 100 days.