Closed
Changes from 1 commit
rebase PR and take spark.master over master
  • Loading branch information
zjffdu committed Sep 20, 2016
commit c91d02a95d8239db5d2d4db7a796a987705a449d
24 changes: 0 additions & 24 deletions R/pkg/R/sparkR.R
@@ -368,30 +368,6 @@ sparkR.session <- function(
}
overrideEnvs(sparkConfigMap, paramMap)
}
I'm a bit confused. To be concrete, I think we were talking about something like:

    sparkConfigMap <- convertNamedListToEnv(sparkConfig)
    namedParams <- list(...)
    if (length(namedParams) > 0) {
      paramMap <- convertNamedListToEnv(namedParams)
      # Override for certain named parameters
      if (exists("spark.master", envir = paramMap)) {
        master <- paramMap[["spark.master"]]
      }
      if (exists("spark.app.name", envir = paramMap)) {
        appName <- paramMap[["spark.app.name"]]
      }
      overrideEnvs(sparkConfigMap, paramMap)
    }
    if (nzchar(master)) {
      sparkConfigMap[["spark.master"]] <- master
    }
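
The precedence this suggests — explicit named parameters win over entries in `sparkConfig`, and `spark.master` is then written back into the config map — can be sketched with plain base-R environments. This is illustrative only: `convertNamedListToEnv` and `overrideEnvs` are SparkR internals, emulated here with `list2env` and a loop, and the sample config values are made up.

    # Hypothetical inputs standing in for sparkR.session() arguments
    sparkConfig <- list(spark.driver.memory = "2g", spark.master = "yarn")
    namedParams <- list(spark.master = "local[4]", spark.app.name = "demo")

    sparkConfigMap <- list2env(sparkConfig)   # stand-in for convertNamedListToEnv
    paramMap <- list2env(namedParams)

    master <- ""
    appName <- "SparkR"
    if (exists("spark.master", envir = paramMap)) {
      master <- paramMap[["spark.master"]]
    }
    if (exists("spark.app.name", envir = paramMap)) {
      appName <- paramMap[["spark.app.name"]]
    }
    # stand-in for overrideEnvs: named parameters override sparkConfig entries
    for (name in ls(paramMap)) {
      assign(name, paramMap[[name]], envir = sparkConfigMap)
    }
    if (nzchar(master)) {
      sparkConfigMap[["spark.master"]] <- master
    }
    # sparkConfigMap now holds spark.master = "local[4]", i.e. the named
    # parameter took precedence over the sparkConfig entry
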

<<<<<<< 8f0c35a4d0dd458719627be5f524792bf244d70a
=======
if (nzchar(master)) {
  sparkConfigMap[["spark.master"]] <- master
}

# do not download if it is run in the sparkR shell
if (!nzchar(master) || is_master_local(master)) {
  if (!is_sparkR_shell()) {
    if (is.na(file.info(sparkHome)$isdir)) {
      msg <- paste0("Spark not found in SPARK_HOME: ",
                    sparkHome,
                    " .\nTo search in the cache directory. ",
                    "Installation will start if not found.")
      message(msg)
      packageLocalDir <- install.spark()
      sparkHome <- packageLocalDir
    } else {
      msg <- paste0("Spark package is found in SPARK_HOME: ", sparkHome)
      message(msg)
    }
  }
}
>>>>>>> [SPARK-17210][SPARKR] sparkr.zip is not distributed to executors when run sparkr in RStudio

if (!exists(".sparkRjsc", envir = .sparkREnv)) {
retHome <- sparkCheckInstall(sparkHome, master)