@@ -46,6 +46,8 @@ private[rest] class CreateSubmissionRequest extends SubmitRestProtocolRequest {
super.doValidate()
assert(sparkProperties != null, "No Spark properties set!")
assertFieldIsSet(appResource, "appResource")
+assertFieldIsSet(appArgs, "appArgs")
+assertFieldIsSet(environmentVariables, "environmentVariables")
Contributor:

What if there are no args or environment variables for a particular job? Is the caller expected to pass in an empty array?

Contributor Author:

Actually, if the caller didn't set "appArgs" or "environmentVariables", it caused a null pointer exception and left the Dispatcher inactive. So now I think the caller should pass an empty array. I could add a test for that case, @susanxhuynh.
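The distinction discussed here can be sketched in plain Scala (standalone code, not Spark's actual `CreateSubmissionRequest`; class and method names below only mirror the PR's pattern): validation should fail fast on a `null` field, while an explicitly empty array or map passes.

```scala
// Standalone sketch of the fail-fast validation this PR adds: a null
// field raises a descriptive error at validation time instead of a
// NullPointerException later that leaves the Dispatcher inactive.
object ValidationSketch {
  class SubmitRequest {
    var appArgs: Array[String] = null
    var environmentVariables: Map[String, String] = null

    // Mirrors the assertFieldIsSet(...) pattern from the diff above.
    private def assertFieldIsSet(value: AnyRef, name: String): Unit =
      require(value != null, s"The $name field is not set in the request")

    def validate(): Unit = {
      assertFieldIsSet(appArgs, "appArgs")
      assertFieldIsSet(environmentVariables, "environmentVariables")
    }
  }

  // True if validation rejects a request whose fields were never set.
  def rejectsNullFields: Boolean = {
    val bad = new SubmitRequest
    try { bad.validate(); false }
    catch { case _: IllegalArgumentException => true }
  }

  // True if a caller passing empty (but non-null) values passes validation.
  def acceptsEmptyFields: Boolean = {
    val ok = new SubmitRequest
    ok.appArgs = Array.empty[String]
    ok.environmentVariables = Map.empty
    try { ok.validate(); true }
    catch { case _: IllegalArgumentException => false }
  }

  def main(args: Array[String]): Unit =
    println(rejectsNullFields && acceptsEmptyFields) // prints: true
}
```

So a caller with no arguments would send an empty array rather than omitting the field.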

assertPropertyIsSet("spark.app.name")
assertPropertyIsBoolean("spark.driver.supervise")
assertPropertyIsNumeric("spark.driver.cores")
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
message.clientSparkVersion = "1.2.3"
message.appResource = "honey-walnut-cherry.jar"
message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_HOME" -> "/test")
Contributor:

There should be a check at the end of this test to make sure these fields show up in the newMessage object. (around L.125)

Member:

and probably don't want to set SPARK_HOME either? :)

Contributor Author:

@felixcheung okay done :)

Contributor Author:

@susanxhuynh I've checked what you said but those variables are overwritten below, so I can't actually check it, right?

Contributor:

@Gschiavon You're right, I didn't see those variables were already set below. But I do have a question now about whether these fields are optional or not. In the original test (L.93), there's a comment about "optional fields", and the app args and env vars are tested there (L.105-106). Since this test was added with Standalone mode in mind, I wonder if these fields are considered optional there? There's no documentation, as you said, so we can't know for sure. I just want to make sure the extra validation doesn't break any existing applications (in Standalone cluster mode).

Contributor:

OK, I was looking at RestSubmissionClient (used by Mesos and Standalone) - the client does always set 'appArgs' and 'environmentVariables': https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala#L427 So it's only a bug when the user does not use the RestSubmissionClient (by using 'curl' directly against the server, for example). That addresses my concern about Standalone mode. As for whether to add a similar check in the Standalone class, I don't have a strong opinion about it (fix it here or fix it in another PR).
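The point above can be illustrated with a standalone sketch (not Spark's real RestSubmissionClient; the case class and helper below are hypothetical stand-ins): a client that always assigns both fields when building the request never sends `null`, so only a hand-rolled request (e.g. a raw curl POST that omits the keys) can trigger the server-side NPE.

```scala
// Standalone sketch: a client construction path that always populates
// appArgs and environmentVariables, even when there is nothing to pass.
object ClientSketch {
  final case class SubmissionRequest(
      appResource: String,
      mainClass: String,
      appArgs: Array[String],
      environmentVariables: Map[String, String])

  // Mirrors the pattern linked above: both fields are always assigned
  // from the launch environment, never left null.
  def constructSubmitRequest(
      appResource: String,
      mainClass: String,
      appArgs: Seq[String],
      env: Map[String, String]): SubmissionRequest =
    SubmissionRequest(appResource, mainClass, appArgs.toArray, env)

  def main(args: Array[String]): Unit = {
    val req = constructSubmitRequest("app.jar", "Main", Seq.empty, Map.empty)
    // With no user args or env vars, the fields are empty, never null.
    println(req.appArgs.isEmpty && req.environmentVariables.isEmpty) // prints: true
  }
}
```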

Contributor Author:

Perfect then, @susanxhuynh. Maybe fix it in another PR?

Contributor:

Yeah, that's fine with me.

Contributor Author (@Gschiavon, Nov 30, 2017):

Cool, I can do it when I have some time.

Let me know if I have to do something else here! @felixcheung

Thanks @susanxhuynh

Contributor:

LGTM

val conf = new SparkConf(false)
conf.set("spark.app.name", "SparkPie")
message.sparkProperties = conf.getAll.toMap
@@ -82,6 +82,12 @@ private[mesos] class MesosSubmitRequestServlet(
val mainClass = Option(request.mainClass).getOrElse {
throw new SubmitRestMissingFieldException("Main class is missing.")
}
+val appArgs = Option(request.appArgs).getOrElse {
+  throw new SubmitRestMissingFieldException("Application arguments are missing.")
Reviewer:

Nit: maybe put the fields here also? Something like Application arguments ("appArgs") are missing. Similar below.
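A hypothetical sketch of the suggested wording (the helper below is illustrative, not code from the PR): include the JSON field name in the error so a raw-HTTP caller knows exactly which key to add.

```scala
// Hypothetical helper building the suggested error text, e.g.
// Application arguments ("appArgs") are missing.
object MissingFieldMessage {
  def message(description: String, jsonField: String): String =
    s"""$description ("$jsonField") are missing."""

  def main(args: Array[String]): Unit =
    println(message("Application arguments", "appArgs"))
}
```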

Contributor Author:

Done @ArtRand

+}
+val environmentVariables = Option(request.environmentVariables).getOrElse {
+  throw new SubmitRestMissingFieldException("Environment variables are missing.")
+}

// Optional fields
val sparkProperties = request.sparkProperties
@@ -91,8 +97,6 @@
val superviseDriver = sparkProperties.get("spark.driver.supervise")
val driverMemory = sparkProperties.get("spark.driver.memory")
val driverCores = sparkProperties.get("spark.driver.cores")
-val appArgs = request.appArgs
-val environmentVariables = request.environmentVariables
val name = request.sparkProperties.getOrElse("spark.app.name", mainClass)

// Construct driver description