Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage account that contains 100 GB of files. The files contain text and numerical values. 75% of the rows contain description data that has an average length of 1.1 MB.
You plan to copy the data from the storage account to an Azure SQL data warehouse.
You need to prepare the files to ensure that the data copies quickly.
Solution: You modify the files to ensure that each row is more than 1 MB.
Does this meet the goal?
A. Yes
B. No

Correct Answer: B
Instead, modify the files to ensure that each row is less than 1 MB.

Reference: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/guidance-for-loading-data

Modifying the files so that each row is more than 1 MB does not meet the goal. PolyBase, the fastest mechanism for loading data into Azure SQL Data Warehouse, cannot load rows larger than 1 MB, so enlarging the rows would slow the load rather than speed it up.
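Because PolyBase rejects rows over 1 MB, a practical preparation step is to scan the export files for oversized rows before loading. The sketch below is a minimal illustration; the function name and the assumption that rows are newline-delimited are mine, not part of the scenario.

```python
# Sketch: find rows larger than 1 MB in a newline-delimited text file.
# PolyBase (used for fast loads into Azure SQL Data Warehouse) cannot
# load rows over 1 MB, so such rows must be split or trimmed first.

MAX_ROW_BYTES = 1 * 1024 * 1024  # PolyBase per-row limit

def oversized_rows(path, max_bytes=MAX_ROW_BYTES):
    """Yield (line_number, size_in_bytes) for rows exceeding max_bytes."""
    with open(path, "rb") as f:
        for lineno, line in enumerate(f, start=1):
            if len(line) > max_bytes:
                yield lineno, len(line)
```

A caller would iterate over `oversized_rows("export.csv")` (file name illustrative) and decide whether to split the description column into a separate table or truncate it.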
To optimize the transfer from the Azure Storage account to Azure SQL Data Warehouse, the following steps can be taken:
- Keep each row under the 1 MB PolyBase row limit.
- Use an efficient file format and compress the files.
- Use Azure Data Factory or PolyBase to perform the load.
- Depending on the size of the data and the available network bandwidth, split the data into smaller files and load them in parallel.
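The splitting step can be sketched as follows. This is a minimal illustration of dividing one large text export into smaller chunk files that a loader can then process in parallel; the chunk size and the `.partNNNN` naming convention are assumptions of mine, not from the scenario.

```python
# Sketch: split a large newline-delimited export into smaller chunk
# files so they can be loaded in parallel (e.g. by PolyBase, which
# reads multiple files concurrently).

def split_file(path, lines_per_chunk=100_000):
    """Split `path` into numbered chunk files; return the chunk paths."""
    chunk_paths = []
    out = None
    with open(path, "rb") as src:
        for i, line in enumerate(src):
            if i % lines_per_chunk == 0:
                if out is not None:
                    out.close()
                chunk_path = f"{path}.part{len(chunk_paths):04d}"
                out = open(chunk_path, "wb")
                chunk_paths.append(chunk_path)
            out.write(line)
    if out is not None:
        out.close()
    return chunk_paths
```

Splitting by line count rather than byte count keeps each row intact, which matters here because a row must never be cut in the middle.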
In conclusion, modifying the files so that each row is more than 1 MB does not help the data copy quickly; the measures above do. Therefore, the answer is B. No.