How to check the number of cores Spark uses?


I have spark.cores.max set to 24 (3 worker nodes), but inside a worker node I see only one process (command = java) consuming memory and CPU. I suspect it is not using all 8 cores (on m2.4xlarge instances).

How can I find out how many cores Spark is actually using?

You can see the number of cores occupied on each worker in the cluster under the Spark web UI.
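Besides the web UI, you can also inspect the effective settings programmatically from a running Spark application. A minimal sketch (assuming a PySpark shell or application where a `SparkContext` named `sc` already exists; the values shown depend on your cluster):

```python
# Inspect core-related settings from an existing SparkContext `sc`
# (run inside pyspark or a submitted application; requires a live cluster).

# The configured cap on total cores across the cluster, if set:
print(sc.getConf().get("spark.cores.max"))

# Default parallelism, which on a standalone cluster typically equals
# the total number of cores granted to the application:
print(sc.defaultParallelism)
```

If `defaultParallelism` reports fewer cores than you expect, check `spark.executor.cores` and the per-worker core count in the master UI; the single `java` process you see on a worker is the executor JVM, which can still run multiple task threads (one per core) inside it.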
