Conversation

@coderplay
Contributor

AFAIK, some Spark applications always use LocalBackend for local execution; Spark SQL is an example. Starting a LocalEndpoint won't add the user classpath to the executor.

```scala
override def start() {
  localEndpoint = SparkEnv.get.rpcEnv.setupEndpoint(
    "LocalBackendEndpoint", new LocalEndpoint(SparkEnv.get.rpcEnv, scheduler, this, totalCores))
}
```

This causes the local executor to fail in scenarios such as loading Hadoop's built-in native libraries, loading other user-defined native libraries, loading user jars, and reading S3 configuration from a site.xml file.
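For context on what a fix has to do: other executor backends split `spark.executor.extraClassPath` on the platform path separator and hand the entries to the executor classloader as `file:` URLs. A rough, self-contained Java sketch of that conversion (the helper name is hypothetical, not Spark API):

```java
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

public class ClasspathUrls {
    // Split a classpath string on the given separator and turn each entry
    // into a file: URL, the form a URLClassLoader accepts.
    static List<URL> toUrls(String classpath, String separator)
            throws MalformedURLException {
        List<URL> urls = new ArrayList<>();
        for (String entry : classpath.split(separator)) {
            if (!entry.isEmpty()) {
                urls.add(new URL("file:/" + entry.replaceFirst("^/+", "")));
            }
        }
        return urls;
    }

    public static void main(String[] args) throws MalformedURLException {
        // java.io.File.pathSeparator would be the real separator; a literal
        // ":" keeps this example deterministic on POSIX-style paths.
        for (URL url : toUrls("/opt/jars/a.jar:/opt/jars/b.jar", ":")) {
            System.out.println(url);
        }
    }
}
```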

@JoshRosen
Contributor

Jenkins, this is ok to test.

@SparkQA

SparkQA commented Jun 29, 2015

Test build #36024 has finished for PR 7091 at commit 45bf62c.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • case class Sha1(child: Expression) extends UnaryExpression with ExpectsInputTypes

@andrewor14
Contributor

Looks fine. @coderplay can you fix the scalastyle?

@coderplay
Contributor Author

@andrewor14 Sure, will do. Any existing formatter for Eclipse or IDEA?

@andrewor14
Contributor

There are some for IntelliJ, but Spark has different style requirements than typical Scala applications. Here you can just click through to the last test failure, look at the bottom of the test output, and fix the violations manually.

@SparkQA

SparkQA commented Jul 9, 2015

Test build #36963 has finished for PR 7091 at commit d215b7f.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Contributor

Looks like it's failing tests legitimately. You might need to add the file: scheme to the URL, e.g.
....map { p => new URL("file:/" + p.stripPrefix("/")) }
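The failure behind this suggestion can be reproduced in isolation: `java.net.URL` rejects a bare filesystem path because it carries no scheme, while the same path with a `file:` prefix parses fine. A minimal demonstration:

```java
import java.net.MalformedURLException;
import java.net.URL;

public class SchemeDemo {
    public static void main(String[] args) throws MalformedURLException {
        // A bare path has no protocol, so URL construction fails.
        try {
            new URL("/opt/jars/app.jar");
        } catch (MalformedURLException e) {
            System.out.println("bare path: " + e.getMessage());
        }
        // With the file: scheme the same path is a valid URL.
        System.out.println("with scheme: " + new URL("file:/opt/jars/app.jar"));
    }
}
```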

Contributor Author

It's a solution, but what if the user specifies an hdfs://xx/yy.jar?

Contributor

If you're running in local mode, it's unlikely your classpath will come from HDFS.
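The fix that eventually landed (commit 365838f, "add default scheme, support relative path") resolves this tension by defaulting to `file:` only when an entry has no scheme of its own. A hedged Java sketch of that rule (illustrative helper, not the actual LocalBackend code):

```java
import java.net.URI;

public class DefaultScheme {
    // Prepend file: only when the entry carries no scheme, so entries like
    // hdfs://host/path pass through untouched.
    static String withDefaultScheme(String entry) {
        return URI.create(entry).getScheme() == null ? "file:" + entry : entry;
    }

    public static void main(String[] args) {
        System.out.println(withDefaultScheme("/opt/jars/a.jar"));
        System.out.println(withDefaultScheme("hdfs://namenode/jars/b.jar"));
    }
}
```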

@SparkQA

SparkQA commented Jul 10, 2015

Test build #36978 has finished for PR 7091 at commit 365838f.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@coderplay
Contributor Author

======================================================================
FAIL: test_time_with_timezone (__main__.SQLTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/sql/tests.py", line 716, in test_time_with_timezone
    self.assertEqual(now, now1)
AssertionError: datetime.datetime(2015, 7, 9, 18, 12, 37, 910045) != datetime.datetime(2015, 7, 9, 18, 12, 37, 910044)

Seems it's caused by commit c9e2ef5

@andrewor14
Contributor

retest this please

@SparkQA

SparkQA commented Jul 10, 2015

Test build #36996 has finished for PR 7091 at commit 365838f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jul 10, 2015

Test build #1034 has finished for PR 7091 at commit 365838f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14
Contributor

Merging into master, thanks!

coderplay added a commit that referenced this pull request Jul 10, 2015
…sspath as other executor backends

AFAIK, some Spark applications always use LocalBackend for local execution; Spark SQL is an example. Starting a LocalEndpoint won't add the user classpath to the executor.
```scala
  override def start() {
    localEndpoint = SparkEnv.get.rpcEnv.setupEndpoint(
      "LocalBackendEndpoint", new LocalEndpoint(SparkEnv.get.rpcEnv, scheduler, this, totalCores))
  }
```
This causes the local executor to fail in scenarios such as loading Hadoop's built-in native libraries, loading other user-defined native libraries, loading user jars, and reading S3 configuration from a site.xml file.

Author: Min Zhou <[email protected]>

Closes #7091 from coderplay/master and squashes the following commits:

365838f [Min Zhou] Fixed java.net.MalformedURLException, add default scheme, support relative path
d215b7f [Min Zhou] Follows spark standard scala style, make the auto testing happy
84ad2cd [Min Zhou] Use system specific path separator instead of ','
01f5d1a [Min Zhou] Merge branch 'master' of https://github.com/apache/spark
e528be7 [Min Zhou] Merge branch 'master' of https://github.com/apache/spark
45bf62c [Min Zhou] SPARK-8675 Executors created by LocalBackend won't get the same classpath as other executor backends
@andrewor14
Contributor

@coderplay can you close this now? I already merged it in master. Thanks for your changes.

@coderplay coderplay closed this Jul 10, 2015