How to check the number of cores Spark uses?


I have spark.cores.max set to 24 [3 worker nodes], but when I look inside a worker node I see only one running process [command = java] consuming memory and CPU. I suspect it is not using all 8 cores (the workers are m2.4xlarge instances).

How can I find out how many cores Spark is actually using?
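
For reference, the configuration is set up roughly like this (a sketch; the app name and master URL are placeholders, not the real values):

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative driver-side setup: 3 workers x 8 cores each (m2.4xlarge),
    // capped at 24 cores for this application via spark.cores.max.
    val conf = new SparkConf()
      .setAppName("my-app")                  // placeholder app name
      .setMaster("spark://master-host:7077") // placeholder standalone master URL
      .set("spark.cores.max", "24")
    val sc = new SparkContext(conf)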

You can see the number of cores occupied on each worker in the cluster under the Spark web UI (for a standalone cluster, the master UI is at port 8080 by default).
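
If you prefer to verify from the driver instead of the UI, a minimal sketch run in spark-shell (where a SparkContext named sc already exists) also works; the "unset" fallback string below is just illustrative:

    // spark.cores.max as seen by the driver ("unset" if the property was not configured).
    println("spark.cores.max = " + sc.getConf.get("spark.cores.max", "unset"))

    // On a standalone cluster, defaultParallelism normally reflects the total
    // number of cores granted to the application, so it is a quick sanity check.
    println("defaultParallelism = " + sc.defaultParallelism)

Also note that in standalone mode each application usually gets a single executor JVM per worker, and that one java process runs its tasks in multiple threads, so seeing only one process does not by itself mean only one core is being used.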

