Amazing talk!
How is running Spark on Kubernetes different from running Spark on a Hadoop cluster?
Kubernetes uses containers to execute your jobs; a Hadoop cluster doesn't use containers. Spark usually comes with YARN (on a Hadoop cluster) to manage resources and schedule executions. With Kubernetes, Spark submits to the Kubernetes API server, and the Kubernetes scheduler manages resources and deploys the driver and executors as pods.
@@MrFromminsk To be honest, YARN uses its own containers, called YARN containers. They're not the same as Docker containers, but they are a kind of container, and that's the difference.
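To make the difference concrete, here is a minimal PySpark sketch of how the same job points at YARN versus the Kubernetes API server as its cluster manager; the API server address and container image name below are hypothetical placeholders.

from pyspark.sql import SparkSession

# On a Hadoop cluster, YARN acts as the cluster manager:
#   .master("yarn")
# On Kubernetes, Spark talks to the API server and executors run as pods:
spark = (
    SparkSession.builder
    .appName("example-job")
    .master("k8s://https://kubernetes.example.com:6443")  # hypothetical API server address
    .config("spark.kubernetes.container.image", "example-registry/spark:latest")  # hypothetical image
    .config("spark.executor.instances", "2")
    .getOrCreate()
)

spark.range(1000).count()  # any action triggers executor pods to be scheduled
spark.stop()

In both cases the application code is the same; only the master URL and the cluster-manager-specific settings change.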
The content is really great, but the video quality is poor.