Memory Calculations In Spark
- Dynamic Resource Allocation (DRA): Spark itself takes care of the memory calculations, which is generally not preferred.
- Cores: the number of concurrent tasks that can execute on each node.
- 5 cores per executor means 5 parallel tasks can run inside each executor.
- No of executors = (Total no of cores) / (No of cores per executor)
- Executor Memory = ((RAM per node) / (No of executors per node)) - (YARN overhead memory (~2 GB)) = x
- So each executor runs with that many cores and roughly x GB of memory.
- Driver memory varies based on the workload.
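
The formulas above can be sketched as a small script. The cluster figures here (10 nodes, 16 cores and 64 GB RAM per node) are hypothetical examples, not values from the post:

```python
# Sketch of the sizing rules above, assuming a hypothetical cluster of
# 10 nodes, each with 16 cores and 64 GB RAM, and 5 cores per executor.
NODES = 10
CORES_PER_NODE = 16
RAM_PER_NODE_GB = 64
CORES_PER_EXECUTOR = 5      # 5 parallel tasks per executor
YARN_OVERHEAD_GB = 2        # memory reserved for YARN, per the post

# No of executors = total cores / cores per executor
executors_per_node = CORES_PER_NODE // CORES_PER_EXECUTOR
total_executors = executors_per_node * NODES

# Executor memory = (RAM per node / executors per node) - YARN overhead
executor_memory_gb = RAM_PER_NODE_GB // executors_per_node - YARN_OVERHEAD_GB

print(f"executors per node: {executors_per_node}")   # 3
print(f"total executors:    {total_executors}")      # 30
print(f"executor memory:    {executor_memory_gb} GB")  # 19 GB
```

The resulting numbers would map onto the `spark-submit` flags `--num-executors`, `--executor-cores`, and `--executor-memory`.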