@@ -79,16 +79,17 @@ case class SetCommand(kv: Option[(String, Option[String])]) extends RunnableCommand
  // Queries all key-value pairs that are set in the SQLConf of the sparkSession.
  case None =>
    val runFunc = (sparkSession: SparkSession) => {
-     sparkSession.conf.getAll.map { case (k, v) => Row(k, v) }.toSeq
+     sparkSession.conf.getAll.toSeq.sorted.map { case (k, v) => Row(k, v) }
    }
    (keyValueOutput, runFunc)

  // Queries all properties along with their default values and docs that are defined in the
  // SQLConf of the sparkSession.
  case Some(("-v", None)) =>
    val runFunc = (sparkSession: SparkSession) => {
-     sparkSession.sessionState.conf.getAllDefinedConfs.map { case (key, defaultValue, doc) =>
-       Row(key, defaultValue, doc)
+     sparkSession.sessionState.conf.getAllDefinedConfs.sorted.map {
+       case (key, defaultValue, doc) =>
+         Row(key, if (defaultValue == null) "null" else defaultValue, doc)
Member Author: This is the root cause of the exceptions raised during decoding.

Member: When the default is null, we are using <undefined>.

@dongjoon-hyun (Member Author, Jan 22, 2017): It's easy to change to that, but that marker seems to be meant for Option-type configs. I took the null to mean the default value itself. Is it better to use the same <undefined> here, too?

Member: I think so. We did it here.

Member Author: Oh, I see. Thank you. I'll update it like that.
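The convention the thread converges on can be sketched in a few self-contained lines (an illustration only, not Spark's SQLConf code; the object name ConfDisplay is invented here):

```scala
object ConfDisplay {
  // Render a config's default value for display. Per the thread's
  // conclusion (assumed here), a null default is shown as the literal
  // string "<undefined>", the marker already used for Option-typed
  // configs, so downstream Row encoding never receives a null.
  def renderDefault(defaultValue: String): String =
    if (defaultValue == null) "<undefined>" else defaultValue
}
```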

}
}
val schema = StructType(
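The sorting change in the first hunk relies on Scala's default tuple ordering: `.sorted` on a Seq[(String, String)] compares element-wise, so entries come out ordered by key. A minimal, self-contained sketch (SortedConfDemo is an invented name, not Spark code):

```scala
object SortedConfDemo {
  // Mirrors the patched branch: turn the conf map into a Seq of
  // (key, value) pairs and sort it; the implicit Ordering for Tuple2
  // compares the first element (the key) before the second.
  def sortedPairs(confs: Map[String, String]): Seq[(String, String)] =
    confs.toSeq.sorted

  def main(args: Array[String]): Unit =
    sortedPairs(Map("test.key3" -> "1", "test.key1" -> "3", "test.key2" -> "2"))
      .foreach { case (k, v) => println(s"$k=$v") }
}
```

Regardless of the map's internal iteration order, the pairs are emitted key1, key2, key3.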
sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala (18 additions & 0 deletions)
@@ -982,6 +982,24 @@ class SQLQuerySuite extends QueryTest with SharedSQLContext {
spark.sessionState.conf.clear()
}

test("SET commands should return a list sorted by key") {
  val overrideConfs = sql("SET").collect()
  sql("SET test.key3=1")
  sql("SET test.key2=2")
  sql("SET test.key1=3")
  val result = sql("SET").collect()
  assert(result ===
    (overrideConfs ++ Seq(
      Row("test.key1", "3"),
      Row("test.key2", "2"),
      Row("test.key3", "1"))).sortBy(_.getString(0))
  )

  val result2 = sql("SET -v").collect()
  assert(result2 === result2.sortBy(_.getString(0)))
  spark.sessionState.conf.clear()
@gatorsmile (Member, Jan 23, 2017): This will not drop the spark.test entries. We need to introduce a function into SQLConf for testing only:

  // For testing only
  private[sql] def unregister(entry: ConfigEntry[_]): Unit = sqlConfEntries.synchronized {
    sqlConfEntries.remove(entry.key)
  }

Create a separate test case; then call SQLConf.unregister(confEntry) in a finally block.

Will be back after a few hours.

Member Author: +1. Thank you, @gatorsmile.
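The cleanup pattern suggested above (register a test-only entry, then unregister it when the test ends) can be illustrated with a self-contained registry; ConfRegistry and its methods are invented stand-ins for SQLConf's sqlConfEntries map, not Spark's actual API:

```scala
object ConfRegistry {
  // Shared, mutable registry of defined configs, keyed by name.
  private val entries = scala.collection.mutable.LinkedHashMap.empty[String, String]

  def register(key: String, default: String): Unit =
    entries.synchronized { entries(key) = default }

  // For testing only: remove an entry so one test's registration does
  // not leak into the sorted listing seen by later tests.
  def unregister(key: String): Unit =
    entries.synchronized { entries.remove(key) }

  def definedKeys: Seq[String] =
    entries.synchronized { entries.keys.toSeq.sorted }
}
```

A test would wrap its body in try { ... } finally { ConfRegistry.unregister("spark.test.fooKey") } so the shared registry stays clean even when an assertion fails partway through.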

}

test("SET commands with illegal or inappropriate argument") {
spark.sessionState.conf.clear()
// Set negative mapred.reduce.tasks for automatically determining