6 changes: 5 additions & 1 deletion R/README.md
@@ -17,10 +17,14 @@ export R_HOME=/home/username/R

#### Build Spark

- Build Spark with [Maven](https://spark.apache.org/docs/latest/building-spark.html#buildmvn) and include the `-Psparkr` profile to build the R package. For example to use the default Hadoop versions you can run
+ Build Spark with [Maven](https://spark.apache.org/docs/latest/building-spark.html#buildmvn) or [SBT](https://spark.apache.org/docs/latest/building-spark.html#building-with-sbt), and include the `-Psparkr` profile to build the R package. For example, to use the default Hadoop versions you can run:

```bash
# Maven
./build/mvn -DskipTests -Psparkr package

# SBT
./build/sbt -Psparkr package
```

#### Running sparkR
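With the `sparkr` profile wired into SBT (see the SparkBuild.scala change below), the R package build can also be invoked as a task of its own from the sbt shell. A minimal sketch, assuming the task is attached to the `core` project as in the diff and that the key resolves as `core/buildRPackage`:

```bash
# Hypothetical direct invocation of the new task; assumes the core project id
# and the buildRPackage key defined in SparkBuild.scala below.
./build/sbt -Psparkr core/buildRPackage
```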
23 changes: 23 additions & 0 deletions project/SparkBuild.scala
@@ -414,6 +414,10 @@ object SparkBuild extends PomBuild {

enable(YARN.settings)(yarn)

if (profiles.contains("sparkr")) {
  // Attach the SparkR settings to core only when built with -Psparkr.
  enable(SparkR.settings)(core)
}

/**
 * Adds the ability to run the spark shell directly from SBT without building an assembly
 * jar.
@@ -888,6 +892,25 @@ object PySparkAssembly {

}

object SparkR {
  import scala.sys.process.Process

  val buildRPackage = taskKey[Unit]("Build the R package")

  lazy val settings = Seq(
    buildRPackage := {
      // Shell out to the existing script that builds and installs SparkR.
      val command = baseDirectory.value / ".." / "R" / "install-dev.sh"
      Process(command.toString).!!
    },
    // Chain the R package build after compilation; Def.taskDyn ensures
    // compile has finished before install-dev.sh runs.
    (Compile / compile) := (Def.taskDyn {
      val c = (Compile / compile).value
      Def.task {
        (Compile / buildRPackage).value
        c
      }
    }).value
  )
}
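
The `(Compile / compile)` override above relies on `Def.taskDyn` to order tasks: plain `.value` references carry no execution-order guarantee in sbt, but everything evaluated inside the `taskDyn` body completes before the task it returns is run. A minimal standalone sketch of the same pattern, with hypothetical `stepA`/`stepB` keys, assuming sbt 1.x:

```scala
import sbt._

object SequencingExample {
  // Hypothetical keys, for illustration only.
  val stepA = taskKey[Unit]("Runs first")
  val stepB = taskKey[Unit]("Runs strictly after stepA")

  lazy val settings = Seq(
    stepA := println("step A"),
    stepB := (Def.taskDyn {
      // stepA completes here, before the inner task is constructed...
      stepA.value
      Def.task {
        // ...so this body is guaranteed to observe stepA's side effects.
        println("step B")
      }
    }).value
  )
}
```

The diff uses the same shape so that `install-dev.sh` never runs against stale class files.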

object Unidoc {

import BuildCommons._