
Commit 76b8d00

Merge pull request apache#124 from shivaram/master
Specify how to change Spark versions in README
2 parents 90c8933 + b690d58

File tree: 1 file changed (+8, -2 lines)


README.md

Lines changed: 8 additions & 2 deletions
@@ -9,8 +9,8 @@ R.
 ## Installing SparkR
 
 ### Requirements
-SparkR requires Scala 2.10 and Spark version >= 0.9.0. Current build by default uses the 1.1.0
-candidate from the Apache staging repositories. You can also build SparkR against a
+SparkR requires Scala 2.10 and Spark version >= 0.9.0. Current build by default uses
+Apache Spark 1.1.0. You can also build SparkR against a
 different Spark version (>= 0.9.0) by modifying `pkg/src/build.sbt`.
 
 SparkR also requires the R package `rJava` to be installed. To install `rJava`,
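
The first hunk keeps the instruction to modify `pkg/src/build.sbt` for a non-default Spark version. A minimal sketch of that path, assuming the build file pins the Spark dependency with a literal "1.1.0" version string (the file's actual contents are not shown in this diff):

    # Assumption: pkg/src/build.sbt declares the Spark dependency version
    # as the literal string "1.1.0"; check the file before running this.
    # Repoint the build at Spark 0.9.1 (any version >= 0.9.0 per the README),
    # then rebuild with the install script referenced later in the README:
    sed -i 's/"1\.1\.0"/"0.9.1"/' pkg/src/build.sbt
    ./install-dev.sh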
@@ -29,6 +29,12 @@ If you wish to try out the package directly from github, you can use [`install_g
 library(devtools)
 install_github("amplab-extras/SparkR-pkg", subdir="pkg")
 
+SparkR by default uses Apache Spark 1.1.0. You can switch to a different Spark
+version by setting the environment variable `SPARK_VERSION`. For example, to
+use Apache Spark 1.2.0, you can run
+
+    SPARK_VERSION=1.2.0 ./install-dev.sh
+
 SparkR by default links to Hadoop 1.0.4. To use SparkR with other Hadoop
 versions, you will need to rebuild SparkR with the same version that [Spark is
 linked
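
Taken together, the added lines document the environment-variable workflow end to end. A short sketch, assuming a local clone of the repository (clone URL inferred from the `install_github` call above):

    # Build SparkR against the default Spark 1.1.0:
    git clone https://github.com/amplab-extras/SparkR-pkg.git
    cd SparkR-pkg
    ./install-dev.sh

    # Rebuild against a different Spark release by setting SPARK_VERSION,
    # as the new README text describes:
    SPARK_VERSION=1.2.0 ./install-dev.sh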
