How to check the number of cores Spark uses?
I have spark.cores.max set to 24 [3 worker nodes], but when I log into a worker node I only see one process [command = java] running and consuming memory and CPU. I suspect it is not using all 8 cores (on m2.4xlarge). How can I find out how many cores Spark is actually using?
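For context, the cap is set roughly like this. This is only a minimal sketch of the configuration in question; the master URL and application name below are placeholders, not the actual values from my cluster:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: cap the total number of cores the application may claim across
// the whole cluster at 24 (placeholder master URL and app name).
val conf = new SparkConf()
  .setMaster("spark://master-host:7077")
  .setAppName("core-count-check")
  .set("spark.cores.max", "24")
val sc = new SparkContext(conf)
```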
You can see the number of cores occupied on each worker in the cluster under the Spark web UI.
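In standalone mode the master's web UI is on port 8080 by default and lists each worker's cores (used/total), while the running application's UI on port 4040 has an Executors tab showing cores per executor. You can also ask the driver directly. A minimal sketch in the Scala spark-shell, assuming a live SparkContext `sc` connected to the same cluster:

```scala
// Read back the configured cap and the parallelism Spark derived from the
// executors it actually acquired ("not set" is just a fallback label here).
println("spark.cores.max    = " + sc.getConf.get("spark.cores.max", "not set"))
println("defaultParallelism = " + sc.defaultParallelism)

// List the executors that registered with the driver; the keys are host:port
// strings, and the driver itself appears as one of the entries.
sc.getExecutorMemoryStatus.keys.foreach(println)
```

In standalone mode, `defaultParallelism` reflects the total cores granted across executors, so comparing it with `spark.cores.max` is a quick sanity check that the cluster actually gave you what you asked for.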