You plan to store 20 TB of data in Azure. The data will be accessed infrequently and visualized by using Microsoft Power BI.
You need to recommend a storage solution for the data.
Which two solutions should you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
Correct Answer: AC
You can use Power BI to analyze and visualize data stored in Azure Data Lake and Azure SQL Data Warehouse (now Azure Synapse Analytics).
Azure Data Lake includes all of the capabilities required to make it easy for developers, data scientists and analysts to store data of any size and shape and at any speed, and do all types of processing and analytics across platforms and languages. It removes the complexities of ingesting and storing all your data while making it faster to get up and running with batch, streaming and interactive analytics. It also integrates seamlessly with operational stores and data warehouses so that you can extend current data applications.
References:
https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-power-bi
https://azure.microsoft.com/en-gb/solutions/data-lake/

Based on the requirements provided, the best storage solutions for storing 20 TB of infrequently accessed data in Azure and visualizing it with Power BI are Azure Data Lake and Azure SQL Data Warehouse.
A. Azure Data Lake: Azure Data Lake is a scalable data storage and analytics service that allows you to store and process large amounts of data. It provides enterprise-grade security features and can store data of any size, type, or format. The service is ideal for big data processing scenarios, such as data exploration, machine learning, and real-time analytics. With Azure Data Lake, you can store and analyze unstructured, semi-structured, and structured data. Because the data will be accessed infrequently, you can place it in the Cool access tier of Azure Data Lake Storage to reduce costs.
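As an illustrative sketch (the resource group, account name, and region are hypothetical placeholders), a Data Lake Storage Gen2 account with the Cool access tier can be provisioned with the Azure CLI:

```shell
# Create a resource group to hold the storage account (names are hypothetical)
az group create --name analytics-rg --location eastus

# Create a StorageV2 account with a hierarchical namespace enabled
# (which makes it Data Lake Storage Gen2) and the Cool access tier
# for infrequently accessed data
az storage account create \
  --name coldlakedata \
  --resource-group analytics-rg \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true \
  --access-tier Cool
```

The Cool tier trades a lower per-GB storage price for higher per-operation access charges, which fits the infrequent-access pattern described in the question.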
B. Azure SQL Data Warehouse: Azure SQL Data Warehouse is a fully managed, petabyte-scale cloud-based data warehouse that provides high-performance analytics for your data. It can store and process large amounts of data and provides query performance and scalability to support big data analytics. The service can also integrate with other Azure services, such as Azure Data Factory, Azure Databricks, and Azure Analysis Services. With Azure SQL Data Warehouse, you can use Power BI to visualize the data and perform ad-hoc analysis.
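As a hedged sketch of how such a warehouse might be provisioned today (Azure SQL Data Warehouse is now a dedicated SQL pool in Azure Synapse Analytics; the workspace, pool, and resource group names below are hypothetical):

```shell
# Create a dedicated SQL pool (formerly Azure SQL Data Warehouse)
# in an existing Synapse workspace; names are placeholders
az synapse sql pool create \
  --name reportingdw \
  --workspace-name analytics-ws \
  --resource-group analytics-rg \
  --performance-level DW100c
```

Power BI can then connect to the pool through the workspace's SQL endpoint, using either Import or DirectQuery mode.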
C. Azure Cosmos DB: Azure Cosmos DB is a globally distributed, multi-model database service that supports multiple NoSQL APIs, including MongoDB, Cassandra, and Azure Table storage. It provides high scalability, low latency, and automatic indexing, making it ideal for real-time applications and globally distributed workloads. However, since the data is accessed infrequently, Azure Cosmos DB may not be the most cost-effective option.
D. Azure SQL Database: Azure SQL Database is a fully managed relational database service that provides high availability, scalability, and security features. It supports several deployment options, including single database, elastic pool, and managed instance. However, since the data will be accessed infrequently, Azure SQL Database may not be the most cost-effective option.
E. Azure Database for PostgreSQL: Azure Database for PostgreSQL is a fully managed PostgreSQL database service that provides high availability, scalability, and security features. It supports several deployment options, including single server, flexible server, and Hyperscale (Citus). However, since the data will be accessed infrequently, Azure Database for PostgreSQL may not be the most cost-effective option.
In conclusion, based on the requirements provided, the best storage solutions for storing 20 TB of data in Azure for infrequent access and visualization using Power BI are Azure Data Lake and Azure SQL Data Warehouse.
The solution that provides additional resources to users while minimizing capital and operational expenditure costs is a hybrid cloud.
A hybrid cloud is a combination of on-premises infrastructure and cloud services, allowing you to take advantage of both environments. It provides you with the flexibility to host critical workloads on-premises while still taking advantage of the public cloud's scalability and cost savings.
Migrating all 100 servers to the public cloud shifts spending from capital to operational expenditure: you pay for cloud services on an ongoing basis and may also incur significant one-time migration costs. The total operational expenditure will depend on how the servers are currently being used and whether their workloads can be migrated efficiently. A complete migration to the public cloud may therefore not be the most cost-effective solution.
Adding an additional data center may provide additional resources, but it will also incur significant capital and operational expenditure costs. You will need to purchase additional hardware, software, and infrastructure to support the new data center, and you will also need to manage the additional resources, which will increase operational expenditure costs.
A private cloud may provide additional resources, but it will still require significant capital and operational expenditure costs to build and maintain the infrastructure. A private cloud is also limited in its scalability compared to the public cloud, making it less flexible.
In summary, a hybrid cloud is the most cost-effective solution: it provides the flexibility to host critical workloads on-premises while taking advantage of the public cloud's scalability, minimizing both capital and operational expenditure.