[SPARK-35180][BUILD] Allow to build SparkR with SBT #32285
Conversation
cc @srowen and @HyukjinKwon
dongjoon-hyun
left a comment
It's great. Could you add the build command into R/README.md along with the existing mvn command, @sarutak ?
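The requested README addition is not shown in full in this excerpt. As a hedged illustration only, the two invocations such a section would document are along these lines (`-Psparkr` enables the R package build; the exact flags in the merged README may differ):

```shell
# Existing Maven build with the SparkR profile (default Hadoop version):
./build/mvn -DskipTests -Psparkr package

# The SBT counterpart this PR enables (assumed form):
./build/sbt -Psparkr package
```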
BTW, I can do this on the AS-IS master branch; what is missing before this PR?
O.K, I'll do it.
Got it!
dongjoon-hyun
left a comment
+1, LGTM.
R/README.md (Outdated)

#### Build Spark

- Build Spark with [Maven](https://spark.apache.org/docs/latest/building-spark.html#buildmvn) and include the `-Psparkr` profile to build the R package. For example to use the default Hadoop versions you can run
+ Build Spark with [Maven](https://spark.apache.org/docs/latest/building-spark.html#buildmvn) or [SBT](http://spark.apache.org/docs/latest/building-spark.html#building-with-sbt), and include the `-Psparkr` profile to build the R package. For example to use the default Hadoop versions you can run
Oh, http -> https.
Ah, I just copied from the URL bar. I'll change it. Thanks.
project/SparkBuild.scala (Outdated)

      Process(command.toString).!!
    },
    (Compile / compile) := {
      buildRPackage.value
I think we build SparkR after the compilation of core, so the order should probably be switched with the next line.
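A hedged sketch of the ordering concern raised here, not the PR's exact code: in sbt, the textual order of `.value` calls inside a task body does not guarantee execution order (they are all resolved as dependencies), so `dependsOn` is the reliable way to ensure SparkR is packaged only after core compilation. The `buildRPackage` task key and the `R/install-dev.sh` invocation are assumptions about this build file:

```scala
// Sketch of an sbt build-definition fragment (assumed names, not the merged code).
import scala.sys.process._

// Custom task that shells out to Spark's R packaging script.
lazy val buildRPackage = taskKey[Unit]("Build the SparkR source package")

buildRPackage := {
  val command = baseDirectory.value / "R" / "install-dev.sh"
  Process(command.toString).!!  // throws on a non-zero exit code
}

// Guarantee ordering: package SparkR only after core classes are compiled.
// Relying on the line order of `.value` calls inside a task body would not do this.
buildRPackage := buildRPackage.dependsOn(Compile / compile).value
```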
Thanks. I've updated it.
HyukjinKwon
left a comment
LGTM2
Kubernetes integration test starting
Kubernetes integration test status failure
Kubernetes integration test unable to build dist. exiting with code: 1
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #137771 has finished for PR 32285 at commit
Test build #137778 has finished for PR 32285 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #137788 has finished for PR 32285 at commit
retest this please.
Merged to master.
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #137804 has finished for PR 32285 at commit
What changes were proposed in this pull request?
This PR proposes a change that allows us to build SparkR with SBT.
Why are the changes needed?
In the current master, SparkR can be built only with Maven.
It would be helpful if we could also build it with SBT.
Does this PR introduce any user-facing change?
No.
How was this patch tested?
I confirmed that I can build SparkR on Ubuntu 20.04 with the following command.
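The command itself is not included in this excerpt. A plausible invocation, inferred from the `-Psparkr` profile discussed in the README change above and not confirmed by the excerpt, would be:

```shell
# Assumed SBT build command for SparkR (default Hadoop version):
./build/sbt -Psparkr package
```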