6 changes: 3 additions & 3 deletions R/pkg/R/DataFrame.R
@@ -1903,7 +1903,7 @@ setMethod("except",
#' }
Member
could you add the default to this doc line above:

 #' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode

Contributor Author

@felixcheung Is it necessary to do that? I notice there's no doc for the default value in other R APIs, and users can already see the default value in the function signature, which is also shown in the R doc:

## S4 method for signature 'DataFrame,character'
write.df(df, path, source = NULL,
  mode = "error", ...)

Member

Sure. I think that's just good practice, but it's fine both ways.
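For context on what the default change below means for callers: with `mode = "error"` as the default, re-running a save against an existing path fails instead of silently appending, and a caller must opt into overwriting or appending explicitly. A hedged usage sketch (assumes a running SparkR session with a `DataFrame` named `df`; the output path is illustrative):

```r
# With the new default, omitting mode is equivalent to mode = "error":
# the write fails if the target path already exists.
write.df(df, path = "/tmp/people.parquet", source = "parquet")

# To replace existing output, pass the mode explicitly:
write.df(df, path = "/tmp/people.parquet", source = "parquet",
         mode = "overwrite")
```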

setMethod("write.df",
          signature(df = "DataFrame", path = "character"),
-         function(df, path, source = NULL, mode = "append", ...){
+         function(df, path, source = NULL, mode = "error", ...){
            if (is.null(source)) {
              sqlContext <- get(".sparkRSQLsc", envir = .sparkREnv)
              source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",
@@ -1928,7 +1928,7 @@ setMethod("write.df",
#' @export
setMethod("saveDF",
          signature(df = "DataFrame", path = "character"),
-         function(df, path, source = NULL, mode = "append", ...){
+         function(df, path, source = NULL, mode = "error", ...){
            write.df(df, path, source, mode, ...)
          })

@@ -1968,7 +1968,7 @@ setMethod("saveDF",
setMethod("saveAsTable",
          signature(df = "DataFrame", tableName = "character", source = "character",
                    mode = "character"),
-         function(df, tableName, source = NULL, mode="append", ...){
+         function(df, tableName, source = NULL, mode="error", ...){
            if (is.null(source)) {
              sqlContext <- get(".sparkRSQLsc", envir = .sparkREnv)
              source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",