Capturing Massive Volumes of Video Streams for Smart City Initiatives - AWS Certified Big Data Specialty Exam Preparation

Hyderabad's Video Streams with Vivotek IB9371-HT Cameras: Live Playback and Monitoring

Question

As part of its smart city initiative, the Greater Hyderabad Municipal Corporation (GHMC), which governs one of the largest cities in southern India, is capturing massive volumes of video 24/7 from the large number of "Vivotek IB9371-HT" cameras installed at traffic lights, parking lots, shopping malls, and just about every public venue. The footage helps solve traffic problems, prevent crime, dispatch emergency responders, and much more.

GHMC hosts its entire infrastructure on AWS. The cameras write their streams securely into Amazon Kinesis Video Streams, where they are consumed by applications for custom video processing and on-demand video playback, and by Amazon Rekognition for video analytics.

The data is consumed to fulfill two requirements:

  1. Kinesis Video Streams is accessed for live playback and to view archived video (minimum latency of 5-10 seconds) by the Investigation team for inspection.
  2. Kinesis Video Streams is accessed for live streaming by the Operations and Monitoring team (low latency, near real-time).

How can this be achieved? Select 2 options.

Answers

Explanations


A. The Operations and Monitoring team uses an application built on HLS (HTTP Live Streaming) to monitor the live situation.
B. The Investigation team uses an application built on HLS (HTTP Live Streaming) to replay videos and view identified archived videos.
C. The Operations and Monitoring team uses an application built on the GetMedia API, along with the Stream Parser Library, to monitor the live situation.
D. The Investigation team uses an application built on the GetMedia API, along with the Stream Parser Library, to replay videos and view identified archived videos.

Answer: B,C.

Option A is incorrect. HTTP Live Streaming (HLS) is an industry-standard HTTP-based media streaming communications protocol, and it can be used to view an Amazon Kinesis video stream for live playback or to view archived video. However, HLS latency is typically between 3 and 5 seconds (and can range from 1 to 10 seconds depending on the use case, player, and network conditions), which does not meet the Operations and Monitoring team's near-real-time requirement.

https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/how-hls.html

Option B is correct. HLS can be used to view an Amazon Kinesis video stream for both live playback and archived video. Its typical latency of 3 to 5 seconds (1 to 10 seconds depending on the use case, player, and network conditions) falls within the Investigation team's 5-10 second tolerance, so an HLS-based application fits their replay and archive-review needs.

https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/how-hls.html

Option C is correct. You use the GetMedia API to build your own applications to process Kinesis video streams. GetMedia is a real-time API with low latency, which suits the Operations and Monitoring team's near-real-time monitoring requirement. If you want a player that uses GetMedia, you have to build it yourself. For information about how to develop an application that displays a Kinesis video stream using GetMedia, see:

https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/how-hls.html

Option D is incorrect. Although GetMedia is a real-time, low-latency API that could also retrieve archived video, any player that uses it must be built from scratch. The Investigation team tolerates 5-10 seconds of latency, so standard HLS playback already meets their requirement and a custom GetMedia player is unnecessary. For information about how to develop an application that displays a Kinesis video stream using GetMedia, see:

https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/how-hls.html

The requirement for the smart city initiative is to capture massive volumes of video streams 24/7 from various public venues such as traffic lights, parking lots, and shopping malls. These streams are ingested securely into Amazon Kinesis Video Streams for custom video processing and for video analytics using Amazon Rekognition. There are two requirements for consuming the data:

  1. Kinesis Video Streams is accessed for live playback and to view archived videos (minimum latency of 5-10 seconds) by the Investigation team for inspection.
  2. Kinesis Video Streams is accessed for live streaming by the Operations and Monitoring team (low latency, near real-time).

To see which pair of options fulfills these requirements, consider each one in turn:

Option A: The Operations and Monitoring team uses an HLS (HTTP Live Streaming) application to monitor the live situation. HLS is a widely used protocol for live streaming and video on demand, supported by many devices and web browsers. It works by breaking the video stream into small segments and delivering them to the client device over HTTP; the client reassembles the segments into a continuous stream for playback, and adaptive streaming lets the video quality adjust to network conditions. However, this segmenting introduces several seconds of latency, so HLS is the wrong fit for the team's near-real-time monitoring requirement.

Option B: The Investigation team uses an HLS (HTTP Live Streaming) application to replay videos and view identified archived videos. The Investigation team needs access to both live and archived video with a latency of 5-10 seconds, which HLS comfortably meets. Since HLS can stream archived video as well as live video, a single HLS-based application covers both replay and archive review. This is one of the two correct options.
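As a rough sketch of how such an HLS application could obtain a playback URL from Kinesis Video Streams using boto3 (the stream name and region below are placeholders, and the actual AWS calls require valid credentials):

```python
def build_hls_request(stream_name, playback_mode="LIVE"):
    """Build parameters for GetHLSStreamingSessionURL.

    PlaybackMode "LIVE" serves the most recent fragments; "ON_DEMAND"
    replays an archived time range (which additionally requires a
    TimestampRange inside the fragment selector, omitted in this sketch).
    """
    params = {"StreamName": stream_name, "PlaybackMode": playback_mode}
    if playback_mode == "ON_DEMAND":
        params["HLSFragmentSelector"] = {
            "FragmentSelectorType": "SERVER_TIMESTAMP"
        }
    return params


def get_hls_url(stream_name, playback_mode="LIVE", region="ap-south-1"):
    # boto3 is imported here so the request builder above stays
    # usable without the AWS SDK installed.
    import boto3

    # 1. Ask the control plane for the endpoint that serves HLS sessions.
    kvs = boto3.client("kinesisvideo", region_name=region)
    endpoint = kvs.get_data_endpoint(
        StreamName=stream_name,
        APIName="GET_HLS_STREAMING_SESSION_URL",
    )["DataEndpoint"]

    # 2. Request a short-lived HLS session URL from that endpoint.
    media = boto3.client(
        "kinesis-video-archived-media",
        endpoint_url=endpoint,
        region_name=region,
    )
    return media.get_hls_streaming_session_url(
        **build_hls_request(stream_name, playback_mode)
    )["HLSStreamingSessionURL"]
```

The returned URL can then be handed to any HLS-capable player (for example, a browser video element with an HLS library) for live or archived playback.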

Option C: The Operations and Monitoring team uses an application built on the GetMedia API, along with the Stream Parser Library, to monitor the live situation. The GetMedia API retrieves media content from a Kinesis video stream, and the Stream Parser Library parses that content to extract individual frames or metadata. This gives the Operations and Monitoring team access to the live video stream with low, near-real-time latency. This is the second correct option.

Option D: The Investigation team uses an application built on the GetMedia API, along with the Stream Parser Library, to replay videos and view identified archived videos. As in Option C, GetMedia with the Stream Parser Library can retrieve and parse both live and archived video. However, this requires building a custom player, which is unnecessary effort when the team's 5-10 second latency tolerance is already met by standard HLS playback.
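A minimal sketch of the GetMedia access pattern with boto3 follows (stream name and region are placeholders; real calls require AWS credentials, and frame-level parsing of the returned MKV bytes is left to a parser such as the Stream Parser Library):

```python
def build_get_media_request(stream_name, live=True):
    """Build parameters for the GetMedia data-plane call.

    StartSelectorType "NOW" begins at the newest chunk (near-real-time
    monitoring); "EARLIEST" would replay from the start of retention.
    """
    selector = {"StartSelectorType": "NOW" if live else "EARLIEST"}
    return {"StreamName": stream_name, "StartSelector": selector}


def read_live_stream(stream_name, region="ap-south-1", chunk_size=8192):
    # boto3 is imported here so the request builder above stays
    # usable without the AWS SDK installed.
    import boto3

    # 1. Ask the control plane for the endpoint that serves GetMedia.
    kvs = boto3.client("kinesisvideo", region_name=region)
    endpoint = kvs.get_data_endpoint(
        StreamName=stream_name, APIName="GET_MEDIA"
    )["DataEndpoint"]

    # 2. Open the continuous media stream from that endpoint.
    media = boto3.client(
        "kinesis-video-media", endpoint_url=endpoint, region_name=region
    )
    resp = media.get_media(**build_get_media_request(stream_name))

    # The payload is a continuous MKV byte stream; yield raw chunks and
    # leave fragment/frame extraction to a dedicated parser.
    while True:
        chunk = resp["Payload"].read(chunk_size)
        if not chunk:
            break
        yield chunk
```

This illustrates the "build it yourself" nature of the GetMedia path: the application owns the connection, the byte stream, and the parsing, in exchange for the lowest available latency.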

In summary, the Investigation team's 5-10 second latency tolerance is met by an HLS-based application (Option B), while the Operations and Monitoring team's near-real-time requirement calls for the low-latency GetMedia API with the Stream Parser Library (Option C). Options A and D pair each team with the wrong access method.