Your company has a stateless web API that performs scientific calculations.
The web API runs on a single Google Kubernetes Engine (GKE) cluster.
The cluster is currently deployed in us-central1.
Your company has expanded to offer your API to customers in Asia.
You want to reduce the latency for users in Asia.
What should you do?
A. Create a second GKE cluster in asia-southeast1, and expose both APIs using a Service of type LoadBalancer.
B. Use a global HTTP(S) load balancer with Cloud CDN enabled.
C. Create a second GKE cluster in asia-southeast1, and use kubemci to create a global HTTP(S) load balancer.
D. Increase the memory and CPU allocated to the application in the existing cluster.

Suggested Answer: B
To reduce latency for users in Asia, the web API needs to be served closer to those users. One way to achieve this is to create a second GKE cluster in a region that is geographically closer to Asia.
Option A suggests creating a second GKE cluster in asia-southeast1 and exposing both APIs using a Service of type LoadBalancer. Each Service of type LoadBalancer provisions its own regional external IP, so clients are not automatically routed to the nearest cluster; without additional DNS-based or geo-aware routing on top, this does not guarantee lower latency for users in Asia. It also leaves two separately exposed endpoints and two clusters to manage, which adds complexity and cost.
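For illustration, the per-cluster exposure in Option A might look like the minimal sketch below; the Service name, labels, and ports (scicalc-api, 80/8080) are assumptions, not taken from the question. Applying the same manifest in both us-central1 and asia-southeast1 yields two independent regional external IPs.

```yaml
# Illustrative only: exposing the API in one cluster with a regional
# Service of type LoadBalancer (Option A). The same manifest applied in
# each cluster produces a separate external IP per region.
apiVersion: v1
kind: Service
metadata:
  name: scicalc-api          # hypothetical Service name
spec:
  type: LoadBalancer
  selector:
    app: scicalc-api         # assumes Pods are labeled app=scicalc-api
  ports:
    - port: 80
      targetPort: 8080       # assumed container port
```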
Option B suggests using a global HTTP(S) load balancer with Cloud CDN enabled. A global external HTTP(S) load balancer exposes a single anycast IP, so connections from users in Asia terminate at the Google edge location nearest to them and then travel over Google's backbone to the cluster. With Cloud CDN enabled, cacheable responses are served directly from that global edge network, so many requests never have to reach the GKE cluster in us-central1 at all. This reduces latency for users in Asia and is more cost-effective than maintaining a second cluster.
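A minimal sketch of how Option B is typically wired up on GKE, assuming the API already runs behind a Service: a BackendConfig enables Cloud CDN for the backend, the Service references it via the cloud.google.com/backend-config annotation, and the Ingress provisions a global external HTTP(S) load balancer. All names and ports here (scicalc-api, scicalc-cdn, 80/8080) are illustrative assumptions.

```yaml
# Sketch of Option B: Cloud CDN enabled on a GKE Ingress backend.
apiVersion: cloud.google.com/v1
kind: BackendConfig
metadata:
  name: scicalc-cdn
spec:
  cdn:
    enabled: true            # cache cacheable responses at Google edge locations
---
apiVersion: v1
kind: Service
metadata:
  name: scicalc-api
  annotations:
    cloud.google.com/backend-config: '{"default": "scicalc-cdn"}'
spec:
  type: NodePort             # GKE Ingress backends are NodePort or NEG-backed
  selector:
    app: scicalc-api
  ports:
    - port: 80
      targetPort: 8080
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: scicalc-ingress
  annotations:
    kubernetes.io/ingress.class: gce   # external global HTTP(S) load balancer
spec:
  defaultBackend:
    service:
      name: scicalc-api
      port:
        number: 80
```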
Option C suggests creating a second GKE cluster in asia-southeast1 and using kubemci to create a global HTTP(S) load balancer. kubemci (Kubernetes multi-cluster ingress) is a command-line tool that configures a single Google Cloud global HTTP(S) load balancer across Ingress resources deployed to multiple GKE clusters, so users in Asia would be routed to the nearer cluster. This would also reduce latency, but compared with Option B it requires running a second cluster, which adds complexity and management overhead.
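As a sketch of how Option C fits together, kubemci consumes a multi-cluster-annotated Ingress manifest (in the older extensions/v1beta1 form the tool expects) plus a pre-reserved global static IP, and applies it across the clusters listed in a kubeconfig. The resource names below are illustrative assumptions.

```yaml
# Hypothetical Ingress manifest for kubemci (Option C). kubemci is run
# against this file with a kubeconfig listing both the us-central1 and
# asia-southeast1 clusters, and programs one global HTTP(S) load balancer.
apiVersion: extensions/v1beta1          # kubemci predates networking.k8s.io/v1
kind: Ingress
metadata:
  name: scicalc-mci                     # assumed name
  annotations:
    kubernetes.io/ingress.class: gce-multi-cluster
    kubernetes.io/ingress.global-static-ip-name: scicalc-global-ip   # pre-reserved static IP (assumed name)
spec:
  backend:
    serviceName: scicalc-api            # NodePort Service present in both clusters
    servicePort: 80
```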
Option D suggests increasing the memory and CPU allocated to the application in the existing cluster. Adding compute capacity in us-central1 does not reduce latency for users in Asia, because the latency is dominated by the network distance between those users and the cluster rather than by processing time.
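For completeness, Option D amounts to raising the resource requests and limits on the existing Deployment, sketched below with assumed names and values; it adds capacity in us-central1 but does not move traffic any closer to Asia.

```yaml
# Illustrative only: larger CPU/memory requests and limits (Option D).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: scicalc-api
spec:
  replicas: 3
  selector:
    matchLabels:
      app: scicalc-api
  template:
    metadata:
      labels:
        app: scicalc-api
    spec:
      containers:
        - name: api
          image: gcr.io/example-project/scicalc-api:latest   # hypothetical image
          resources:
            requests:
              cpu: "2"       # increased from a smaller request (assumed values)
              memory: 4Gi
            limits:
              cpu: "4"
              memory: 8Gi
```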
Overall, the most effective solution to reduce latency for users in Asia is to use a global HTTP(S) load balancer with Cloud CDN enabled, as suggested in Option B.