Importing 70 TB of Data to Azure: Cost-effective Solution
Question
You have 70 TB of files on your on-premises file server.
You need to recommend a solution for importing the data to Azure. The solution must minimize cost.
What Azure service should you recommend?
Answer

Correct Answer: C (Azure Data Box)

Explanation
Microsoft offers a solution designed to get customer data into the Azure public cloud in a cost-effective, secure, and efficient manner: Azure Data Box.
Data Box is generally available. It is a rugged appliance with 100 TB of capacity; an organization copies its data onto the device and ships it back to Microsoft, where the data is uploaded to Azure.
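Before ordering a device, it is worth confirming that the dataset actually fits on a single Data Box. Below is a minimal sketch of such a check; the share path is a hypothetical placeholder, and 100 TB is the device's raw capacity as stated above (usable capacity after formatting is somewhat lower).

```python
import os

# Hypothetical root of the on-premises file share to be migrated.
SOURCE_ROOT = r"\\fileserver\data"

# Raw capacity of a single Azure Data Box device, in bytes (100 TB).
DATA_BOX_CAPACITY = 100 * 10**12

def dataset_size(root: str) -> int:
    """Walk the file tree and sum the size of every regular file."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip unreadable files; a real migration should log these
    return total

if __name__ == "__main__":
    size = dataset_size(SOURCE_ROOT)
    print(f"Dataset size: {size / 10**12:.1f} TB")
    print(f"Fits on one Data Box: {size <= DATA_BOX_CAPACITY}")
```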
Incorrect Answers:
A: StorSimple would not be able to handle 70 TB of data.
Reference: https://www.vembu.com/blog/what-is-microsoft-azure-data-box-disk-edge-heavy-gateway-overview/

The best option for importing a large amount of data to Azure while minimizing cost is Azure Data Box. Azure Data Box is a service designed for securely transferring large amounts of data to Azure. It provides a rugged, tamper-resistant appliance that is shipped to your location, loaded with your data, and shipped back to Microsoft for upload to Azure. This approach is cost-effective for large transfers because it avoids the time and bandwidth that moving the data over a network would require.
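To make the bandwidth argument concrete, here is a quick back-of-the-envelope calculation; the link speeds are illustrative assumptions, not figures from the question.

```python
# Rough transfer-time estimate for moving 70 TB over a network link.
# Link speeds below are illustrative assumptions.

DATASET_BITS = 70 * 10**12 * 8  # 70 TB expressed in bits

for label, bits_per_second in [("100 Mbps", 100e6), ("1 Gbps", 1e9), ("10 Gbps", 10e9)]:
    days = DATASET_BITS / bits_per_second / 86400
    print(f"{label}: ~{days:.1f} days of continuous, saturated transfer")
```

Even a fully saturated 1 Gbps link would need roughly a week of uninterrupted transfer, while 100 Mbps would take over two months, which is why shipping a device is the cheaper path at this scale.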
Azure StorSimple is a hybrid cloud storage solution that integrates on-premises storage arrays with cloud-based storage. While it can be used for transferring data to Azure, it is not the most cost-effective solution for large-scale data transfers.
Azure Batch is a service for running large-scale parallel and batch computing jobs. While it can be used for data processing and analysis, it is not the best solution for importing large amounts of data to Azure.
Azure Stack Hub is an extension of Azure that allows you to run Azure services on-premises. While it brings Azure capabilities into your own datacenter, it is not a data transfer service and does not address the requirement to import data into Azure.