
Conversation

@gczsjdy commented Mar 13, 2018

What changes were proposed in this pull request?

In some cases, when an outer project uses a pre-built Spark as a dependency, getScalaVersion fails because the launcher build directory doesn't exist. This PR makes the check also look in the jars directory.
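For illustration, a minimal sketch (not necessarily what this patch does) of inferring the Scala version from the distribution's jars directory, assuming the usual layout where $SPARK_HOME/jars holds artifacts named like spark-core_2.11-2.3.0.jar; the class and method names here are hypothetical:

    import java.io.File;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    class ScalaVersionFromJars {
      // Matches the Scala binary-version suffix in artifact names, e.g. "_2.11-".
      private static final Pattern SCALA_SUFFIX = Pattern.compile("_(2\\.1[0-2])-");

      static String guessScalaVersion(String sparkHome) {
        File[] jars = new File(sparkHome, "jars").listFiles();
        if (jars != null) {
          for (File jar : jars) {
            Matcher m = SCALA_SUFFIX.matcher(jar.getName());
            if (m.find()) {
              return m.group(1);  // e.g. "2.11" from spark-core_2.11-2.3.0.jar
            }
          }
        }
        return null;  // caller falls back to the existing build-directory check
      }
    }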

How was this patch tested?

Existing tests.

@gczsjdy changed the title from "[CORE] Better scala version check" to "[SPARK-23667][CORE] Better scala version check" on Mar 13, 2018
@gczsjdy (Author) commented Mar 16, 2018

cc @cloud-fan @viirya

@viirya (Member) commented Mar 16, 2018

For this case, shouldn't we just set SPARK_SCALA_VERSION?

@gczsjdy (Author) commented Mar 16, 2018

@viirya Yes, but that only works for people who are willing to dig into the Spark code, and it requires manual effort. Isn't it better if we handle this automatically?

@vanzin (Contributor) commented Mar 16, 2018

Can you provide more information in the bug report? e.g. a sample application and a sample error.

I don't think this is the correct change, but without your use case I'm not sure what the right change would be.

@gczsjdy (Author) commented Mar 19, 2018

@vanzin Thanks. :)
I am testing OAP with a pre-built Spark in local-cluster mode.
This runs on Travis, and no SPARK_HOME is set.
The mvn test command produces this error:

    23:49:56.997 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error running executor
    java.lang.IllegalStateException: Cannot find any build directories.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
        at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:241)
        at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:147)
        at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:118)
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:47)
        at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:63)
        at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:51)
        at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:145)
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)

@vanzin (Contributor) commented Mar 19, 2018

> This runs on Travis, and no SPARK_HOME is set.

That sounds a little odd. If that is true, then your proposed code wouldn't work either, since it requires SPARK_HOME to be known.

In any case, there are two calls to getScalaVersion().

First is:

    boolean prependClasses = !isEmpty(getenv("SPARK_PREPEND_CLASSES"));
    boolean isTesting = "1".equals(getenv("SPARK_TESTING"));
    if (prependClasses || isTesting) {
      String scala = getScalaVersion();

And your code shouldn't be triggering that, since both env variables are for Spark development and other applications shouldn't be using them.

Second call is a little later:

    String jarsDir = findJarsDir(getSparkHome(), getScalaVersion(), !isTesting && !isTestingSql);

Here getScalaVersion() is only needed when running Spark from a git clone, not from the distribution package. So the right thing would be to move getScalaVersion() to CommandBuilderUtils, and call it from findJarsDir only if needed.
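For concreteness, a hedged sketch of that suggestion, assuming findJarsDir's current logic (check the distribution's jars directory first, then the git-clone assembly output); the Supplier-based signature is illustrative, not Spark's actual API:

    import java.io.File;
    import java.util.function.Supplier;

    class CommandBuilderUtilsSketch {
      static String findJarsDir(String sparkHome, Supplier<String> scalaVersion,
          boolean failIfNotFound) {
        // Distribution layout: the jars directory exists, so no Scala version
        // is needed at all.
        File libdir = new File(sparkHome, "jars");
        if (!libdir.isDirectory()) {
          // Git-clone layout: only now resolve the Scala version to locate the
          // assembly build output.
          libdir = new File(sparkHome,
              String.format("assembly/target/scala-%s/jars", scalaVersion.get()));
        }
        if (failIfNotFound && !libdir.isDirectory()) {
          throw new IllegalStateException(
              "Library directory '" + libdir.getAbsolutePath() + "' does not exist.");
        }
        return libdir.isDirectory() ? libdir.getAbsolutePath() : null;
      }
    }

With the lookup deferred this way, a user running from a distribution package never reaches the build-directory check that throws the error above.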

@vanzin (Contributor) commented May 11, 2018

Do you plan to update this PR? Otherwise it should be closed.

@gczsjdy (Author) commented May 13, 2018

@vanzin Sorry, I will update it next week. Thanks.

@gczsjdy (Author) commented May 20, 2018

@vanzin Sorry for the late reply. According to the call stack, it's the first call site of getScalaVersion; isTesting is true, so we go into that path.
This happens on Travis.

@vanzin (Contributor) commented May 22, 2018

I don't understand your reply. SPARK_TESTING should only be set during Spark's own unit tests. You shouldn't be setting it in your tests, because you're not testing Spark.

If you are, you should fix your testing infrastructure to not do that.

@AmplabJenkins commented

Can one of the admins verify this patch?

@srowen mentioned this pull request on Jul 3, 2018
@asfgit closed this in 5bf95f2 on Jul 4, 2018
zifeif2 pushed a commit to zifeif2/spark that referenced this pull request Nov 22, 2025
Closes apache#20932
Closes apache#17843
Closes apache#13477
Closes apache#14291
Closes apache#20919
Closes apache#17907
Closes apache#18766
Closes apache#20809
Closes apache#8849
Closes apache#21076
Closes apache#21507
Closes apache#21336
Closes apache#21681
Closes apache#21691

Author: Sean Owen <[email protected]>

Closes apache#21708 from srowen/CloseStalePRs.