Using Match Statement and updating config property location in documentation
akshatb1 committed May 1, 2020
commit a93ce76d8205da6eb01ab41644c57504174d405d
24 changes: 13 additions & 11 deletions core/src/main/scala/org/apache/spark/deploy/Client.scala
@@ -61,7 +61,8 @@ private class ClientEndpoint(

  private val lostMasters = new HashSet[RpcAddress]
  private var activeMasterEndpoint: RpcEndpointRef = null
-  private val waitAppCompletion = conf.getBoolean("spark.submit.waitAppCompletion", false)
+  private val waitAppCompletion = conf.getBoolean("spark.standalone.submit.waitAppCompletion",
+    false)
  private val REPORT_DRIVER_STATUS_INTERVAL = 1000


@@ -156,19 +157,20 @@ private class ClientEndpoint(
            System.exit(-1)
          case _ =>
            if (!waitAppCompletion) {
-              logInfo(s"No exception found and waitAppCompletion is false, " +
-                s"exiting spark-submit JVM.")
-              System.exit(0)
-            } else if (statusResponse.state.get == DriverState.FINISHED ||
-              statusResponse.state.get == DriverState.FAILED ||
-              statusResponse.state.get == DriverState.ERROR ||
-              statusResponse.state.get == DriverState.KILLED) {
-              logInfo(s"waitAppCompletion is true, state is ${statusResponse.state.get}, " +
+              logInfo(s"spark-submit not configured to wait for completion, " +
                s"exiting spark-submit JVM.")
              System.exit(0)
            } else {
-              logTrace(s"waitAppCompletion is true, state is ${statusResponse.state.get}," +
-                s"continue monitoring driver status.")
+              statusResponse.state.get match {
+                case DriverState.FINISHED | DriverState.FAILED |
+                  DriverState.ERROR | DriverState.KILLED =>
+                  logInfo(s"State of $driverId is ${statusResponse.state.get}, " +
+                    s"exiting spark-submit JVM.")
+                  System.exit(0)
+                case _ =>
+                  logTrace(s"State of $driverId is ${statusResponse.state.get}," +
+                    s"continue monitoring driver status.")
+              }
            }
        }
      } else {
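The core of the change above is replacing a chain of `==` comparisons with a single match arm using alternation. Below is a minimal, self-contained sketch of that pattern with illustrative names — Spark's real `DriverState` is an `Enumeration`; a sealed trait stands in for it here:

```scala
// Sketch only (not Spark source): one case arm covering several terminal
// states, as the refactored match above does.
object DriverStateMatchSketch {
  sealed trait DriverState
  case object SUBMITTED extends DriverState
  case object RUNNING extends DriverState
  case object FINISHED extends DriverState
  case object FAILED extends DriverState
  case object ERROR extends DriverState
  case object KILLED extends DriverState

  // Collapses the old four-way || chain into a single alternation pattern.
  def isTerminal(state: DriverState): Boolean = state match {
    case FINISHED | FAILED | ERROR | KILLED => true
    case _ => false
  }

  def main(args: Array[String]): Unit = {
    println(isTerminal(FINISHED)) // true
    println(isTerminal(RUNNING))  // false
  }
}
```

Either form is behaviorally equivalent; the match version centralizes the terminal-state check and makes adding a new terminal state a one-token change.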
28 changes: 18 additions & 10 deletions docs/spark-standalone.md
@@ -240,16 +240,6 @@ SPARK_MASTER_OPTS supports the following system properties:
    </td>
    <td>1.6.3</td>
  </tr>
-  <tr>
-    <td><code>spark.submit.waitAppCompletion</code></td>
-    <td><code>true</code></td>
-    <td>
-      In Standalone cluster mode, controls whether the client waits to exit until the application completes.
-      If set to <code>true</code>, the client process will stay alive reporting the application's status.
-      Otherwise, the client process will exit after submission.
-    </td>
-    <td>3.1.0</td>
-  </tr>
  <tr>
    <td><code>spark.worker.timeout</code></td>
    <td>60</td>
@@ -384,6 +374,24 @@ To run an interactive Spark shell against the cluster, run the following command

You can also pass an option `--total-executor-cores <numCores>` to control the number of cores that spark-shell uses on the cluster.

+# Spark Properties
+
+Spark applications support the following configuration properties specific to standalone mode:
+<table class="table">
+  <tr><th style="width:21%">Property Name</th><th>Default Value</th><th>Meaning</th><th>Since Version</th></tr>
+  <tr>
+    <td><code>spark.standalone.submit.waitAppCompletion</code></td>
+    <td><code>false</code></td>
+    <td>
+      In Standalone cluster mode, controls whether the client waits to exit until the application completes.
Review comment (Member): Standalone -> standalone; reporting -> polling
Reply (Contributor Author): Updated in the latest commit.
+      If set to <code>true</code>, the client process will stay alive reporting the application's status.
+      Otherwise, the client process will exit after submission.
+    </td>
+    <td>3.1.0</td>
+  </tr>
+</table>
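
For illustration, here is a hedged sketch of how a flag like this is read, mirroring the `conf.getBoolean` call in the patch. It assumes spark-core on the classpath, and setting the value programmatically stands in for passing `--conf` to spark-submit:

```scala
import org.apache.spark.SparkConf

// Sketch only: reading the standalone wait-for-completion flag from SparkConf.
object WaitAppCompletionSketch {
  def main(args: Array[String]): Unit = {
    // In practice the value would arrive via `spark-submit --conf`; it is set
    // programmatically here only to keep the sketch self-contained.
    val conf = new SparkConf()
      .set("spark.standalone.submit.waitAppCompletion", "true")

    // The default is false, so clients that do not opt in exit right after submission.
    val waitAppCompletion =
      conf.getBoolean("spark.standalone.submit.waitAppCompletion", false)
    println(s"waitAppCompletion = $waitAppCompletion") // prints: waitAppCompletion = true
  }
}
```

With the flag left at its default, spark-submit returns as soon as the driver is submitted; with it set to `true`, the client polls the driver's state and exits only once a terminal state (FINISHED, FAILED, ERROR, or KILLED) is reached.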


# Launching Spark Applications

The [`spark-submit` script](submitting-applications.html) provides the most straightforward way to
submit a compiled Spark application to the cluster.