You work as a machine learning specialist for an electric bicycle company.
The electric bicycles your company produces have IoT sensors on them that transmit usage and maintenance information to your company data lake.
You are using Kinesis Data Streams to gather the bicycle IoT data and store it into an S3 data store that you can use for your machine learning models.
You are on the team assigned to use the IoT data to predict when your customers' electric bicycles need maintenance. The IoT data that the electric bicycles produce is unstructured, and, depending on the manufacturer of the IoT part, the data sometimes has a different schema structure.
You need to clean and classify the IoT data before using it in your machine learning model.
How can you build an ETL script to perform the necessary cleaning and classification, given that the message data has differing schema structures?
A. Use the Apache Spark SparkSQL DataRecord construct in your ETL script.
B. Use the Apache Spark SparkSQL DataFrame construct in your ETL script.
C. Use the AWS Glue DynamicFrame construct in your ETL script.
D. Use the AWS Glue DynamicRecord construct in your ETL script.

Answer: C.
Option A is incorrect.
There is no DataRecord construct in Apache Spark SparkSQL.
Option B is incorrect.
The Apache Spark SparkSQL DataFrame requires a known schema and does not efficiently handle data whose schema structure is unknown or changing.
This option would produce suboptimal results.
Option C is correct.
The AWS Glue DynamicFrame allows each record to be self-describing to handle unknown or changing schemas.
Option D is incorrect.
A DynamicRecord represents a logical record within a DynamicFrame; it is a single row of that frame.
You pass DynamicFrames from transform to transform, not individual DynamicRecords.
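To make this concrete, the following minimal sketch (in Python, the language Glue ETL scripts are typically written in) chains two built-in Glue transforms; each one accepts a DynamicFrame and returns a new DynamicFrame, so the DynamicRecords inside are never handled individually. The variable iot_dyf and the field names bike_id and battery_voltage are hypothetical placeholders, assuming a DynamicFrame has already been loaded via a GlueContext.

```python
from awsglue.transforms import Filter, SelectFields

# Each built-in transform takes a DynamicFrame and returns a new DynamicFrame;
# "iot_dyf", "bike_id", and "battery_voltage" are hypothetical placeholders.
complete_records = Filter.apply(
    frame=iot_dyf,
    f=lambda rec: rec["bike_id"] is not None,
)
trimmed = SelectFields.apply(
    frame=complete_records,
    paths=["bike_id", "battery_voltage"],
)
```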
Reference:
Please see the AWS Glue developer guide titled Machine Learning Transforms in AWS Glue, and the AWS Glue developer guide titled DynamicFrame Class.
In this scenario, the electric bicycle company has a requirement to predict when their customers' electric bicycles need maintenance using IoT data collected from sensors. The IoT data produced by electric bicycles is unstructured and may have varying schema structures depending on the manufacturer of the IoT part. Therefore, an ETL (Extract, Transform, Load) script is needed to clean and classify the IoT data before using it in a machine learning model.
AWS Glue is a fully managed ETL service provided by AWS that simplifies the task of building ETL scripts. Glue allows us to create data processing workflows and automates much of the heavy lifting involved in building ETL scripts. Glue provides a flexible and scalable infrastructure to perform data processing tasks on data stored in various data stores, including S3.
To build an ETL script using Glue, we chain a series of transforms to clean and reshape the data. Transforms are operations that take input data, apply some processing, and produce output data. A Glue script works with two data structures, the DynamicFrame and the Apache Spark DataFrame, and relies on the AWS Glue Data Catalog as its metadata repository.
DynamicFrames are similar to Apache Spark DataFrames, except that each record is self-describing, so no schema is required up front and records with differing or changing schemas can coexist in the same DynamicFrame. DynamicFrames can handle various data formats such as CSV, JSON, and Avro, and support transformations such as filtering, joining, mapping, and aggregating.
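As a rough illustration of how this could look in practice, the sketch below is a hypothetical Glue ETL job written in Python (PySpark). The S3 paths and field names (bike_id, battery_voltage, motor_temp_c) are placeholders rather than values from the scenario; resolveChoice and ApplyMapping are shown as one plausible way to reconcile fields whose types differ across IoT part manufacturers.

```python
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping, DropNullFields
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read the raw IoT messages from S3; no schema is declared up front because
# each DynamicFrame record is self-describing. The bucket path is hypothetical.
raw = glueContext.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/iot/raw/"]},
    format="json",
)

# Reconcile a field whose type differs across manufacturers (for example, a
# battery voltage reported as a string by some sensors and a double by others).
resolved = raw.resolveChoice(specs=[("battery_voltage", "cast:double")])

# Keep only the columns the maintenance model needs and drop empty fields.
mapped = ApplyMapping.apply(
    frame=resolved,
    mappings=[
        ("bike_id", "string", "bike_id", "string"),
        ("battery_voltage", "double", "battery_voltage", "double"),
        ("motor_temp_c", "double", "motor_temp_c", "double"),
    ],
)
cleaned = DropNullFields.apply(frame=mapped)

# Write the cleaned data back to S3 for model training.
glueContext.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/iot/cleaned/"},
    format="parquet",
)
job.commit()
```

Because every record in the DynamicFrame carries its own schema, a script like this can tolerate messages in which some manufacturers add, omit, or retype fields.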
DataFrames are a standard Apache Spark data structure used for structured data processing. They are based on a distributed collection of data organized into named columns and require a schema before the data is loaded. DataFrames can handle various data formats such as CSV, JSON, and Parquet, and are optimized for data processing tasks that involve filtering, aggregation, and projection.
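The two structures are interoperable. Continuing the hypothetical sketch above, a DynamicFrame can be converted to a Spark DataFrame once its schema has been resolved, and converted back afterwards, so DataFrame-style operations remain available when needed:

```python
from awsglue.dynamicframe import DynamicFrame

# Convert the cleaned DynamicFrame from the sketch above into a Spark
# DataFrame for standard SparkSQL operations ...
df = cleaned.toDF()
hot_motors = df.filter(df["motor_temp_c"] > 80.0)  # hypothetical threshold

# ... then convert back to a DynamicFrame to continue the Glue pipeline.
hot_dyf = DynamicFrame.fromDF(hot_motors, glueContext, "hot_motors_dyf")
```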
The AWS Glue Data Catalog is a metadata repository that stores metadata about data sources, databases, and tables. We can use the Data Catalog to manage and discover metadata for Glue workflows.
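If the IoT data has already been cataloged (for example by a Glue crawler), the same kind of DynamicFrame can be created directly from a Data Catalog table instead of raw S3 paths; the database and table names below are hypothetical:

```python
# Create a DynamicFrame from a table registered in the AWS Glue Data Catalog.
# "iot_lake" and "bicycle_telemetry" are hypothetical catalog names.
catalog_dyf = glueContext.create_dynamic_frame.from_catalog(
    database="iot_lake",
    table_name="bicycle_telemetry",
)
```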
In the given scenario, since the IoT data produced by the electric bicycles is unstructured and may have varying schema structures, we need a Glue construct that tolerates unknown schemas, which points to the Glue constructs in options C and D rather than the SparkSQL constructs in options A and B.
Option C is the correct answer because DynamicFrames are specifically designed for semi-structured data processing and can handle varying schema structures. Using DynamicFrames, we can perform various transformations such as filtering, joining, mapping, and aggregating on the data.
Option D is incorrect because a DynamicRecord is a single logical record (row) within a DynamicFrame; it is not something you build an ETL script around or pass between transforms.
Option A is incorrect because it suggests using Apache Spark SparkSQL DataRecord, which is not a valid transform type in Glue.
Option B is incorrect because it suggests using Apache Spark SparkSQL DataFrames, which are designed for structured data processing with a known schema and do not efficiently handle varying schema structures.
In summary, when building an ETL script to clean and classify IoT data with varying schema structures, it is recommended to use AWS Glue with DynamicFrames.