
spark Section



Default Value: worker
Valid Values: off, worker, or both
Changes Take Effect: After start or restart

The mode used when starting Spark. If set to off, Spark is not started by Data Processing Server; its state is instead managed externally. If set to worker, only a worker node is started. If set to both, both a worker node and a master node are started. Note: Genesys recommends setting this option on each node to clearly specify that node's role. Alternatively, you can set the option to worker on the Cluster object and override that value on the master node by setting it to both.
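The recommended cluster-wide default plus a per-node override might look like the following sketch. The option name (mode), section name, and annex layout shown here are assumptions for illustration only; consult your deployment's option reference for the actual names.

```ini
; Hypothetical annex of the Cluster object:
; default every node in the cluster to worker mode.
[spark]
mode = worker

; Hypothetical annex of the master node's object:
; override the cluster default so this node starts
; both a worker node and a master node.
[spark]
mode = both
```

With this layout, adding a new worker requires no per-node configuration, and only the master node carries an override.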

This page was last edited on February 23, 2017, at 21:17.

