[SPARK-16184][SPARKR] conf API for SparkSession #13885
Conversation
Test build #61156 has finished for PR 13885 at commit
R/pkg/NAMESPACE (Outdated)

```
export("sparkR.init")
export("sparkR.stop")
export("sparkR.session.stop")
export("conf")
```
can we call this sparkR.conf ?
sure - since it depends on sparkSession, I think it belongs in SQLContext.R. Should it be sparkR.session.conf (because it is the session's RuntimeConfig)? That feels too long, though.
also, this is read-only - the setter is sparkR.session(), in line with SparkSession's builder syntax.
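To illustrate the read/set split discussed above, here is a hedged sketch. The getter name and the config key are only illustrative (the final merged name was still under discussion at this point), and it assumes a SparkR session is already running:

```
# Sketch of the pattern from the review discussion: the runtime config is
# read-only, and values are set by (re)invoking the session builder.
# Function names follow the thread and may differ from the merged API.

# Set a config value through the builder-style session call:
sparkR.session(spark.sql.shuffle.partitions = "8")

# Read it back through the runtime-config getter as a named list:
conf("spark.sql.shuffle.partitions")
```

This mirrors SparkSession's builder syntax on the Scala side, where `SparkSession.builder.config(...)` sets values and `spark.conf` exposes them.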
Test build #61192 has finished for PR 13885 at commit
Test build #61194 has finished for PR 13885 at commit
R/pkg/R/SQLContext.R (Outdated)

```
} else {
  conf <- callJMethod(sparkSession, "conf")
  value <- if (missing(defaultValue)) {
    callJMethod(conf, "get", key)  # throws if key not found
```
instead of throwing a Java exception, can we catch it and throw a stop() from the SparkR code? It's slightly more user friendly.
Done
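A minimal sketch of the change being requested, using the surrounding SQLContext.R context from the diff above. The error message wording is illustrative, not the exact merged text:

```
# Hedged sketch: surface a missing key as an R error via stop() instead of
# letting the raw Java exception reach the user. `conf`, `key`,
# `defaultValue`, and callJMethod come from the diff shown above.
value <- if (missing(defaultValue)) {
  tryCatch(callJMethod(conf, "get", key),
           error = function(e) {
             # stop() concatenates its arguments into one message
             stop("Config '", key, "' is not set")  # illustrative wording
           })
} else {
  callJMethod(conf, "get", key, defaultValue)
}
```

Wrapping the JVM call in tryCatch keeps the R-facing error short, instead of printing a full Java stack trace in the sparkR shell.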
Test build #61245 has finished for PR 13885 at commit
@shivaram this is ready, thanks!
LGTM. Thanks @felixcheung - Merging this into master and branch-2.0 |
## What changes were proposed in this pull request?

Add `conf` method to get Runtime Config from SparkSession

## How was this patch tested?

unit tests, manual tests

This is how it works in sparkR shell:

```
SparkSession available as 'spark'.
> conf()
$hive.metastore.warehouse.dir
[1] "file:/opt/spark-2.0.0-bin-hadoop2.6/R/spark-warehouse"

$spark.app.id
[1] "local-1466749575523"

$spark.app.name
[1] "SparkR"

$spark.driver.host
[1] "10.0.2.1"

$spark.driver.port
[1] "45629"

$spark.executorEnv.LD_LIBRARY_PATH
[1] "$LD_LIBRARY_PATH:/usr/lib/R/lib:/usr/lib/x86_64-linux-gnu:/usr/lib/jvm/default-java/jre/lib/amd64/server"

$spark.executor.id
[1] "driver"

$spark.home
[1] "/opt/spark-2.0.0-bin-hadoop2.6"

$spark.master
[1] "local[*]"

$spark.sql.catalogImplementation
[1] "hive"

$spark.submit.deployMode
[1] "client"

> conf("spark.master")
$spark.master
[1] "local[*]"
```

Author: Felix Cheung <[email protected]>

Closes #13885 from felixcheung/rconf.

(cherry picked from commit 30b182b)
Signed-off-by: Shivaram Venkataraman <[email protected]>
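The shell session shows `conf()` with no arguments and with a single key. The review diff's `missing(defaultValue)` check implies a third usage with a fallback value; a hedged sketch (the key and default here are illustrative, not taken from the session above):

```
# Illustrative only: read a config value, falling back to a supplied
# default when the key is unset. The two-argument form is inferred from
# the diff's missing(defaultValue) check and is not shown in the shell
# transcript above.
conf("spark.sql.shuffle.partitions", "200")
```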