Conversation

@felixcheung (Member) commented Nov 30, 2016

What changes were proposed in this pull request?

If SparkR is running as a package and it has previously downloaded the Spark JAR, it should be able to run as before without having to set SPARK_HOME. With this bug, the Spark auto-install only works in the first session.

This seems to be a regression from the earlier behavior.

The fix is to always try to install, or check for the cached Spark, when running in an interactive session. As discussed before, we should only install Spark when running in an interactive session (R shell, RStudio, etc.).
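
For context, the gating logic amounts to roughly the following (a minimal sketch; `resolveSparkHome` is a hypothetical name, and the real change lives in SparkR's session-startup code):

```r
library(SparkR)

# Sketch only: resolve a usable SPARK_HOME at session start.
# resolveSparkHome() is a hypothetical name, not the actual SparkR internal.
resolveSparkHome <- function() {
  sparkHome <- Sys.getenv("SPARK_HOME")
  if (nzchar(sparkHome)) {
    return(sparkHome)              # explicitly set by the user; use as-is
  }
  if (interactive()) {
    # install.spark() checks the local cache first and only downloads a
    # Spark distribution when no matching version is cached.
    return(install.spark())
  }
  stop("SPARK_HOME is not set and this is not an interactive session")
}
```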

How was this patch tested?

Manually
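
A plausible way to reproduce the scenario by hand (an assumption about the steps, not the author's recorded procedure): in a fresh interactive session with SPARK_HOME unset, startup should find the Spark copy cached by an earlier session instead of failing.

```r
# Assumed repro; not taken verbatim from the PR.
Sys.unsetenv("SPARK_HOME")           # simulate SparkR installed as a plain package
library(SparkR)
sparkR.session(master = "local[*]")  # should reuse the cached Spark download
sparkR.session.stop()
```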

@felixcheung (Member, Author)

@yanboliang @shivaram

@SparkQA commented Nov 30, 2016

Test build #69386 has finished for PR 16077 at commit 866727d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@yanboliang (Contributor)

Does this mean it will install the Spark package each time a new session starts in interactive mode? Thanks.

@felixcheung (Member, Author) commented Nov 30, 2016 via email

@yanboliang (Contributor)

LGTM

@shivaram (Contributor) commented Dec 4, 2016

Just to clarify, this limits the auto-install feature to SparkR running in an interactive shell -- is that right? I think that's mostly fine, as it's the use case we were targeting, but it would be good to check our documentation and update it appropriately.

@felixcheung (Member, Author)

Right. I've been reviewing the possible code paths for this over the last few days, and I'm pretty confident that this change will not run install.spark in cluster modes (where Spark and the JVM would already be running).

Also, you are right: I just found that we didn't really talk about how install.spark would be called in sparkR.session() - I'll add that.
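
For reference, `interactive()` is the standard base-R check this discussion hinges on; a quick illustration (base-R behavior, not code from the patch):

```r
# interactive() is TRUE in the R shell or RStudio, and FALSE under
# Rscript or R CMD BATCH -- which should cover the backend R process
# launched by spark-submit in cluster modes.
if (interactive()) {
  message("interactive session: auto-install may run")
} else {
  message("non-interactive: SPARK_HOME is expected to be set")
}
```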

@SparkQA commented Dec 5, 2016

Test build #69654 has finished for PR 16077 at commit 98f3250.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@shivaram (Contributor) commented Dec 5, 2016

LGTM. Merging this to master and branch-2.1.

asfgit pushed a commit that referenced this pull request Dec 5, 2016
… a package without Spark

Author: Felix Cheung <[email protected]>

Closes #16077 from felixcheung/rsessioninteractive.

(cherry picked from commit b019b3a)
Signed-off-by: Shivaram Venkataraman <[email protected]>
asfgit closed this in b019b3a Dec 5, 2016
robert3005 pushed a commit to palantir/spark that referenced this pull request Dec 15, 2016
… a package without Spark

Author: Felix Cheung <[email protected]>

Closes apache#16077 from felixcheung/rsessioninteractive.
uzadude pushed a commit to uzadude/spark that referenced this pull request Jan 27, 2017
… a package without Spark

Author: Felix Cheung <[email protected]>

Closes apache#16077 from felixcheung/rsessioninteractive.