[SPARK-8675] Executors created by LocalBackend won't get the same classpath as other executor backends #7091
Conversation
Jenkins, this is ok to test.
Test build #36024 has finished for PR 7091 at commit
Looks fine. @coderplay can you fix the scalastyle?
@andrewor14 Sure, will do. Is there an existing formatter for Eclipse or IDEA?
There are some for IntelliJ, but Spark has different style requirements than typical Scala applications. Here you can just click into the last test failure, look at the bottom of the test output, and fix those manually.
Test build #36963 has finished for PR 7091 at commit
Looks like it's failing tests legitimately. You might need to add the file: scheme to the URL, e.g.
....map { p => new URL("file:/" + p.stripPrefix("/")) }
It's a solution, but what if the user specifies an hdfs://xx/yy.jar?
From https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala#L202, it seems Spark doesn't support non-local classpath entries.
If you're running in local mode, it's unlikely your classpath will come from HDFS.
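The thread above lands on treating local-mode classpath entries as plain local files. As a hedged illustration (not the code from this PR; the helper name pathToUrl is made up), converting an entry through java.io.File both supplies the missing file: scheme, avoiding the MalformedURLException a scheme-less string would trigger, and resolves relative paths:

```scala
import java.io.File
import java.net.URL

// Illustrative only: turn a local classpath entry (absolute or relative) into
// a file: URL. Going through java.io.File supplies the scheme, so new URL(...)
// is never fed a bare "/some/path", and relative entries are resolved against
// the current working directory.
def pathToUrl(entry: String): URL =
  new File(entry).getAbsoluteFile.toURI.toURL

// e.g. pathToUrl("lib/app.jar") => file:/<cwd>/lib/app.jar
```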
Test build #36978 has finished for PR 7091 at commit
Seems it's caused by commit c9e2ef5.
retest this please
Test build #36996 has finished for PR 7091 at commit
Test build #1034 has finished for PR 7091 at commit
Merging into master, thanks!
[SPARK-8675] Executors created by LocalBackend won't get the same classpath as other executor backends
AFAIK, some Spark applications always use LocalBackend to do some local initialization; Spark SQL is an example. Starting a LocalBackend won't add the user classpath to the executor.
```scala
override def start() {
localEndpoint = SparkEnv.get.rpcEnv.setupEndpoint(
"LocalBackendEndpoint", new LocalEndpoint(SparkEnv.get.rpcEnv, scheduler, this, totalCores))
}
```
This causes the local executor to fail in scenarios such as loading Hadoop's built-in native libraries, loading other user-defined native libraries, loading user jars, or reading S3 configuration from a site.xml file (see the sketch after the commit list below).
Author: Min Zhou <[email protected]>
Closes #7091 from coderplay/master and squashes the following commits:
365838f [Min Zhou] Fixed java.net.MalformedURLException, add default scheme, support relative path
d215b7f [Min Zhou] Follows spark standard scala style, make the auto testing happy
84ad2cd [Min Zhou] Use system specific path separator instead of ','
01f5d1a [Min Zhou] Merge branch 'master' of https://github.com/apache/spark
e528be7 [Min Zhou] Merge branch 'master' of https://github.com/apache/spark
45bf62c [Min Zhou] SPARK-8675 Executors created by LocalBackend won't get the same classpath as other executor backends
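For readers skimming the squashed commits above (default scheme, relative paths, system-specific path separator), here is a rough sketch of the general shape such a fix can take. The helper name getUserClasspath and the exact wiring are assumptions for illustration, not a quote of the merged patch:

```scala
import java.io.File
import java.net.URL

import org.apache.spark.SparkConf

// Sketch: derive the local-mode executor's user classpath from
// spark.executor.extraClassPath, the same setting other executor backends honor.
def getUserClasspath(conf: SparkConf): Seq[URL] = {
  conf.getOption("spark.executor.extraClassPath")   // e.g. "a.jar:b.jar" on Unix
    .toSeq
    .flatMap(_.split(File.pathSeparator))           // platform separator, not ","
    .map(new File(_).getAbsoluteFile.toURI.toURL)   // default to the file: scheme
}
```

The resulting Seq[URL] would then be handed to the Executor that the local endpoint creates, so user jars and native-library wrappers resolve the same way they do under other backends.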
@coderplay can you close this now? I already merged it into master. Thanks for your changes.