
Conversation

@ueshin
Member

@ueshin ueshin commented Nov 18, 2014

I tried to build for Scala 2.11 using sbt with the following command:

```
$ sbt/sbt -Dscala-2.11 assembly
```

but it ends with the following error messages:

```
[error] (streaming-kafka/*:update) sbt.ResolveException: unresolved dependency: org.apache.kafka#kafka_2.11;0.8.0: not found
[error] (catalyst/*:update) sbt.ResolveException: unresolved dependency: org.scalamacros#quasiquotes_2.11;2.0.1: not found
```

The reason is:
If the system property -Dscala-2.11 is set without a value, SparkBuild.scala adds the scala-2.11 profile, but sbt-pom-reader activates the scala-2.10 profile instead of scala-2.11, because the activator PropertyProfileActivator used internally by sbt-pom-reader checks whether the property value is empty.

If the property is instead set to a non-empty value, there is no need to add profiles in SparkBuild.scala, because sbt-pom-reader can handle them as expected.
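A minimal sketch of the idea, assuming a hypothetical helper name (the actual change lives in SparkBuild.scala):

```scala
// Hypothetical sketch: a value-less -D flag leaves the system property set
// to the empty string. Overwriting it with a non-empty value lets
// sbt-pom-reader's PropertyProfileActivator see the property as set, so the
// matching profile is activated, mirroring Maven's -Dname => name=true.
def normalizeProfileFlag(name: String): Unit = {
  val v = System.getProperty(name)
  if (v != null && v.isEmpty) {
    System.setProperty(name, "true")
  }
}
```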

@SparkQA

SparkQA commented Nov 18, 2014

Test build #23545 has started for PR 3342 at commit ce98d0f.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Nov 18, 2014

Test build #23545 has finished for PR 3342 at commit ce98d0f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/23545/

Contributor


Why do we no longer need to turn the scala-2.10 profile on in this case? Will Maven turn it on automatically?

Member Author


Yes, the profile scala-2.10 is activated by the condition:

<property><name>!scala-2.11</name></property>

so Maven will turn it on if the property scala-2.11 is null or empty.
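For reference, a property-based activation of this shape in a POM looks roughly like this (a sketch, not the exact Spark pom.xml contents):

```xml
<!-- Sketch: a profile that is active when the named property is NOT set
     (or is null/empty); the leading "!" negates the check. -->
<profile>
  <id>scala-2.10</id>
  <activation>
    <property>
      <name>!scala-2.11</name>
    </property>
  </activation>
</profile>
```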

@SparkQA

SparkQA commented Nov 18, 2014

Test build #23559 has started for PR 3342 at commit 4eef52b.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Nov 18, 2014

Test build #23559 has finished for PR 3342 at commit 4eef52b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/23559/

@pwendell
Contributor

Hey @ueshin, one question. If the PropertyProfileActivator is not triggered by blank values, then why does the Maven build work when you run -Dscala-2.11? Doesn't it use this code internally as well?

@ueshin
Member Author

ueshin commented Nov 19, 2014

@pwendell, Maven handles -Dname as -Dname=true (here) before executing the build.

FYI: PropertyProfileActivator is here.
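That rewrite can be modeled with a small sketch (a simplified model of the CLI behavior, not Maven's actual code; parseDefine is a hypothetical name):

```scala
// Simplified model of how Maven's CLI turns -D arguments into properties:
// "-Dname" (no value) becomes name=true, while "-Dname=value" keeps value.
def parseDefine(arg: String): (String, String) = {
  val body = arg.stripPrefix("-D")
  body.indexOf('=') match {
    case -1 => (body, "true")                   // -Dname => name=true
    case i  => (body.take(i), body.drop(i + 1)) // -Dname=value
  }
}
```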

@pwendell
Contributor

I see - so the properties are re-written earlier on. Good find!

Contributor


Could you add a comment here and link to the maven code this is replicating? It might be tricky for other developers to understand why this exists.

@pwendell
Contributor

I didn't test this locally, but if this works, this is a good way of doing it.

@SparkQA

SparkQA commented Nov 19, 2014

Test build #23601 has started for PR 3342 at commit 14d86e8.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Nov 19, 2014

Test build #23601 has finished for PR 3342 at commit 14d86e8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/23601/

@pwendell
Contributor

Thanks @ueshin - I will pull this in.

@asfgit asfgit closed this in f9adda9 Nov 19, 2014
asfgit pushed a commit that referenced this pull request Nov 19, 2014
I tried to build for Scala 2.11 using sbt with the following command:

```
$ sbt/sbt -Dscala-2.11 assembly
```

but it ends with the following error messages:

```
[error] (streaming-kafka/*:update) sbt.ResolveException: unresolved dependency: org.apache.kafka#kafka_2.11;0.8.0: not found
[error] (catalyst/*:update) sbt.ResolveException: unresolved dependency: org.scalamacros#quasiquotes_2.11;2.0.1: not found
```

The reason is:
If the system property `-Dscala-2.11` is set without a value, `SparkBuild.scala` adds the `scala-2.11` profile, but `sbt-pom-reader` activates the `scala-2.10` profile instead of `scala-2.11`, because the activator `PropertyProfileActivator` used internally by `sbt-pom-reader` checks whether the property value is empty.

If the property is instead set to a non-empty value, there is no need to add profiles in `SparkBuild.scala`, because `sbt-pom-reader` can handle them as expected.

Author: Takuya UESHIN <[email protected]>

Closes #3342 from ueshin/issues/SPARK-4429 and squashes the following commits:

14d86e8 [Takuya UESHIN] Add a comment.
4eef52b [Takuya UESHIN] Remove unneeded condition.
ce98d0f [Takuya UESHIN] Set non-empty value to system property "scala-2.11" if the property exists instead of adding profile.

(cherry picked from commit f9adda9)
Signed-off-by: Patrick Wendell <[email protected]>