
Conversation

@felixcheung (Member) commented Jun 24, 2016

What changes were proposed in this pull request?

Add a `conf` method to get the Runtime Config from a SparkSession

How was this patch tested?

unit tests, manual tests

This is how it works in the SparkR shell:

```
 SparkSession available as 'spark'.
> conf()
$hive.metastore.warehouse.dir
[1] "file:/opt/spark-2.0.0-bin-hadoop2.6/R/spark-warehouse"

$spark.app.id
[1] "local-1466749575523"

$spark.app.name
[1] "SparkR"

$spark.driver.host
[1] "10.0.2.1"

$spark.driver.port
[1] "45629"

$spark.executorEnv.LD_LIBRARY_PATH
[1] "$LD_LIBRARY_PATH:/usr/lib/R/lib:/usr/lib/x86_64-linux-gnu:/usr/lib/jvm/default-java/jre/lib/amd64/server"

$spark.executor.id
[1] "driver"

$spark.home
[1] "/opt/spark-2.0.0-bin-hadoop2.6"

$spark.master
[1] "local[*]"

$spark.sql.catalogImplementation
[1] "hive"

$spark.submit.deployMode
[1] "client"

> conf("spark.master")
$spark.master
[1] "local[*]"
```

@felixcheung (Member, Author) commented:

@shivaram

@SparkQA commented Jun 24, 2016

Test build #61156 has finished for PR 13885 at commit 2079018.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

R/pkg/NAMESPACE (outdated diff)

```
export("sparkR.init")
export("sparkR.stop")
export("sparkR.session.stop")
export("conf")
```
Contributor: Can we call this `sparkR.conf`?

Member (Author): Sure. Since it depends on SparkSession, I think it belongs in SQLContext.R. Should it be `sparkR.session.conf` (because it is the session's RuntimeConfig)? That feels too long, though.

@felixcheung (Member, Author) commented Jun 24, 2016:

Also, this is read-only; the setter is `sparkR.session()`, in line with SparkSession's builder syntax.
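As a usage sketch of that getter/setter split: `sparkConfig` is the documented argument of `sparkR.session()` for passing properties, while the getter name `sparkR.conf` follows the suggestion above; the property name here is just an illustrative choice.

```r
# Set a property at session creation time (builder-style setter):
sparkR.session(sparkConfig = list(spark.sql.shuffle.partitions = "200"))

# Read it back through the read-only runtime config getter:
sparkR.conf("spark.sql.shuffle.partitions")
```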

@SparkQA commented Jun 24, 2016

Test build #61192 has finished for PR 13885 at commit e9be83d.

  • This patch fails R style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 24, 2016

Test build #61194 has finished for PR 13885 at commit bdd290c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

```
} else {
  conf <- callJMethod(sparkSession, "conf")
  value <- if (missing(defaultValue)) {
    callJMethod(conf, "get", key) # throws if key not found
```
Contributor: Instead of throwing a Java exception, can we catch it and throw a `stop` from the SparkR code? It's slightly more user-friendly.

Member (Author): Done
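A sketch of what that catch-and-rethrow could look like around the snippet above; the error message and structure in the merged code may differ:

```r
# Convert the JVM's "key not found" failure into a plain R error.
value <- if (missing(defaultValue)) {
  tryCatch(callJMethod(conf, "get", key),
           error = function(e) {
             stop(paste0("Config '", key, "' is not set"))
           })
} else {
  callJMethod(conf, "get", key, defaultValue)
}
```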

@SparkQA commented Jun 26, 2016

Test build #61245 has finished for PR 13885 at commit 385645b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@felixcheung (Member, Author):

@shivaram this is ready, thanks!

@shivaram (Contributor):

LGTM. Thanks @felixcheung - Merging this into master and branch-2.0

asfgit pushed a commit that referenced this pull request Jun 26, 2016
Author: Felix Cheung <[email protected]>

Closes #13885 from felixcheung/rconf.

(cherry picked from commit 30b182b)
Signed-off-by: Shivaram Venkataraman <[email protected]>
@asfgit closed this in 30b182b Jun 26, 2016