## Installing SparkR

### Requirements
SparkR requires Scala 2.10 and Spark version >= 0.9.0. The current build by default uses
Apache Spark 1.1.0. You can also build SparkR against a
different Spark version (>= 0.9.0) by modifying `pkg/src/build.sbt`.
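As a sketch of what that modification might look like, the Spark dependency pin in `pkg/src/build.sbt` could be changed along these lines (the exact setting names and layout of the real file are assumptions, not confirmed by this excerpt):

```scala
// pkg/src/build.sbt -- illustrative fragment only; the actual file
// may declare these settings differently.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Pin any Spark release >= 0.9.0 to build SparkR against.
  "org.apache.spark" %% "spark-core" % "0.9.1"
)
```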

SparkR also requires the R package `rJava` to be installed. To install `rJava`,
If you wish to try out the package directly from github, you can use `install_github` from `devtools`:
    library(devtools)
    install_github("amplab-extras/SparkR-pkg", subdir="pkg")

SparkR by default uses Apache Spark 1.1.0. You can switch to a different Spark
version by setting the environment variable `SPARK_VERSION`. For example, to
use Apache Spark 1.2.0, you can run

    SPARK_VERSION=1.2.0 ./install-dev.sh

SparkR by default links to Hadoop 1.0.4. To use SparkR with other Hadoop
versions, you will need to rebuild SparkR with the same version that [Spark is
linked
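By analogy with the `SPARK_VERSION` variable above, such a rebuild might look like the following (the `SPARK_HADOOP_VERSION` variable name is an assumption for illustration, not confirmed by this excerpt):

```shell
# Rebuild SparkR against a different Hadoop version.
# SPARK_HADOOP_VERSION is an assumed variable name, paralleling SPARK_VERSION.
SPARK_HADOOP_VERSION=2.2.0 ./install-dev.sh
```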