
spark Section

executorMemory

Default Value: None
Valid Values: Valid memory limit
Changes Take Effect: After start or restart


Use this option to control the amount of memory available to Spark for executing tasks on each node. Genesys recommends at least two gigabytes per node, but more memory can improve performance if the hardware allows. For details of the memory string format, consult the Spark documentation.
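A minimal sketch of how this option might be set, assuming an INI-style option section; the 2g value is illustrative (Spark memory strings accept suffixes such as 512m or 2g):

```ini
[spark]
; At least two gigabytes per node is recommended; raise this if hardware allows.
executorMemory=2g
```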

host

Default Value: None
Valid Values: hostname of the Spark Master node
Changes Take Effect: After start or restart


The name of the Spark Master host. The value should match the host name that Java's InetAddress.getLocalHost() returns on that machine.

masterWebPort

Default Value: 8080
Valid Values: Valid port number
Changes Take Effect: After start or restart


The number of the TCP port that the Spark Master web UI listens on. This option is provided for cases when the default port is already in use by another service.

port

Default Value: 7077
Valid Values: Valid port number
Changes Take Effect: After start or restart


The port number of the Spark Master host.
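The host, port, and masterWebPort options together identify the Spark Master node. A sketch under assumed values (the hostname is hypothetical, and the INI-style layout is illustrative):

```ini
[spark]
; Must match what InetAddress.getLocalHost() returns on the Master machine.
host=spark-master.example.com
; TCP port of the Spark Master (default 7077).
port=7077
; Master web UI port; change only if 8080 is taken by another service.
masterWebPort=8080
```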

sparkHeartbeatTimeout

Default Value: 60
Valid Values: Positive integer
Changes Take Effect: After start or restart


The time, in seconds, between two successive heartbeat calls to the Spark metrics API.

sparkStartTimeout

Default Value: 20
Valid Values: Positive integer
Changes Take Effect: After start or restart


The time, in seconds, between a Spark start or restart and the first check of its API. On slower machines, increase this value so that Spark has enough time to start successfully, without triggering a restart cycle.

startMode

Default Value: worker
Valid Values: off, worker, or both
Changes Take Effect: After start or restart


The mode used when starting Spark. If set to off, Data Processing Server does not start Spark; its state is managed externally. If set to worker, only a worker node is started. If set to both, both a worker node and a master node are started. Note: Genesys recommends that you set this option on each node to specify its role explicitly. However, you can set the Cluster object to worker mode and override that value for the master node by setting that node to both.
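The recommended override pattern can be sketched as two configuration fragments; both the INI-style layout and the placement of the sections are illustrative assumptions:

```ini
; On the Cluster object - the default inherited by every node
[spark]
startMode=worker
```

```ini
; On the master node's own object - overrides the inherited value
[spark]
startMode=both
```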

uri

Default Value: None
Valid Values: Valid Spark URI
Changes Take Effect: After start or restart


Advanced. When Spark is running externally, set the URI instead of the host and port. The URI must include the protocol in addition to the host and port.
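For example, pointing at an external Spark Master on a hypothetical host, using Spark's standard spark:// scheme (hostname and INI-style layout are illustrative):

```ini
[spark]
; Set instead of the host and port options; protocol, host, and port are all required.
uri=spark://spark-master.example.com:7077
```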

workerWebPort

Default Value: 8081
Valid Values: Valid port number
Changes Take Effect: After start or restart


The number of the TCP port that the Spark Worker web UI listens on. This option is provided for cases when the default port is already in use by another service.

This page was last modified on February 23, 2017, at 13:17.