
spark Section

startMode

Default Value: worker
Valid Values: off, worker, or both
Changes Take Effect: After start or restart


The mode used when starting Spark. If set to off, Spark is not started by Data Processing Server; its state is managed externally instead. If set to worker, only a worker node is started. If set to both, both a worker node and a master node are started.

Note: Genesys recommends setting this option on each node to state its role explicitly. Alternatively, you can set the Cluster object to worker and override that value on the master node by setting that node to both.
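As a sketch, the recommended cluster-plus-override pattern described above might look like the following in the options of the Cluster and master-node objects. The section name spark and option name startMode come from this page; the comments and object labels are illustrative assumptions, not verbatim configuration from the product.

```ini
; Cluster object options (inherited by all nodes):
; every node runs a Spark worker by default.
[spark]
startMode=worker

; Master node object options (override the inherited value):
; this node runs both a Spark master and a worker.
[spark]
startMode=both
```

With this layout, worker nodes need no per-node setting, and only the master node carries an override; per the note above, setting startMode explicitly on every node is the clearer alternative.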

This page was last modified on February 23, 2017, at 13:17.