
Conversation

@zjffdu (Contributor) commented Sep 8, 2015

Throw a more readable exception. Please help review. Thanks!

@zjffdu (Contributor, Author) commented Sep 8, 2015

Pasting the new exception after this patch:

15/09/08 10:05:52 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not find jar with spark-yarn related code,Make sure SPARK_PREPEND_CLASSES is not set
    at org.apache.spark.deploy.yarn.Client$$anonfun$org$apache$spark$deploy$yarn$Client$$sparkJar$2.apply(Client.scala:1048)
    at org.apache.spark.deploy.yarn.Client$$anonfun$org$apache$spark$deploy$yarn$Client$$sparkJar$2.apply(Client.scala:1048)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$sparkJar(Client.scala:1048)
    at org.apache.spark.deploy.yarn.Client$.populateClasspath(Client.scala:1160)
    at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:534)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:645)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:119)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
    at com.zjffdu.tutorial.spark.WordCount$.main(WordCount.scala:24)
    at com.zjffdu.tutorial.spark.WordCount.main(WordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:680)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
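
For context, here is a minimal sketch of the kind of check this patch adds. The method name, structure, and fallback are assumptions inferred from the stack trace (Client.scala:1048 failing inside an Option.getOrElse), not the exact diff; the point is replacing an opaque lookup failure with a SparkException carrying a readable message.

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkException}

// Sketch only: the real Client.sparkJar handles more cases (e.g. the
// SPARK_JAR env var); this shows just the error path being improved.
object ClientSketch {
  def sparkJar(conf: SparkConf): String = {
    conf.getOption("spark.yarn.jar").getOrElse {
      // jarOfClass returns None when classes were loaded from a directory
      // rather than a jar, e.g. when SPARK_PREPEND_CLASSES points the
      // classpath at build output directories.
      SparkContext.jarOfClass(this.getClass).getOrElse {
        throw new SparkException("Could not find jar with spark-yarn related code. " +
          "Make sure SPARK_PREPEND_CLASSES is not set.")
      }
    }
  }
}
```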

Inline review comment (Contributor):
Nit: this line is too long (100 chars).

@holdenk (Contributor) commented Sep 8, 2015

Wouldn't it make more sense to check for this in SparkSubmitArguments or maybe the YARN-specific validation code?

Inline review comment (Contributor):
If you want to be more helpful, you can also mention that the user can set spark.yarn.jar to force Spark to use a specific Spark archive instead of trying to figure it out from the classpath.
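
As a concrete illustration of that suggestion (the jar path below is hypothetical), a user can pin the archive explicitly rather than letting the YARN client derive it from the classpath:

```scala
import org.apache.spark.SparkConf

// Point spark.yarn.jar at a specific Spark assembly; a path on HDFS lets
// YARN containers localize the jar instead of uploading it on each submit.
val conf = new SparkConf()
  .setAppName("WordCount")
  .set("spark.yarn.jar", "hdfs:///apps/spark/spark-assembly-1.5.0-hadoop2.6.0.jar")
```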

@vanzin (Contributor) commented Sep 8, 2015

> Wouldn't it make more sense to check for this in SparkSubmitArguments or maybe the YARN-specific validation code?

I'm not sure there's a common enough place where you could call this code, at least without reflection. You could try to copy it, but duplicated code is bad. In any case, this should only really affect Spark developers, so while it's nice to have a better message, we shouldn't need to be too fancy here.

@vanzin (Contributor) commented Sep 8, 2015

ok to test

@holdenk (Contributor) commented Sep 8, 2015

@vanzin makes sense if it's only going to impact Spark developers (though it might also be a user-facing exception).

@SparkQA commented Sep 9, 2015

Test build #42155 has finished for PR 8649 at commit 39fe23a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin (Contributor) commented Sep 9, 2015

@zjffdu could you re-word the message? I'd suggest something like:

    Could not find jar containing Spark classes. The jar can be defined using the
    spark.yarn.jar configuration option. If testing Spark, either set that option or make
    sure SPARK_PREPEND_CLASSES is not set.

@zjffdu (Contributor, Author) commented Sep 9, 2015

Thanks @vanzin, updated the patch with the re-worded message.

@SparkQA commented Sep 9, 2015

Test build #42194 has finished for PR 8649 at commit 08f7c9a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin (Contributor) commented Sep 9, 2015

LGTM, merging.

@asfgit closed this in c0052d8 on Sep 9, 2015.