2 changes: 1 addition & 1 deletion docs/core-migration-guide.md
@@ -25,7 +25,7 @@ license: |
## Upgrading from Core 2.4 to 3.0

- The `org.apache.spark.ExecutorPlugin` interface and related configuration has been replaced with
- `org.apache.spark.plugin.SparkPlugin`, which adds new functionality. Plugins using the old
+ `org.apache.spark.api.plugin.SparkPlugin`, which adds new functionality. Plugins using the old

[Member Author] This is a typo fix.

interface must be modified to extend the new interfaces. Check the
[Monitoring](monitoring.html) guide for more details.

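For reference, a rough sketch of what a plugin migrated to the new interface might look like. This is illustrative only: the `com.example.MyPlugin` name is made up, and the sketch assumes the companion `DriverPlugin` and `ExecutorPlugin` interfaces in the same `org.apache.spark.api.plugin` package together with registration through the `spark.plugins` setting; the Monitoring guide referenced above is the authoritative description of the API.

```scala
package com.example

import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin, SparkPlugin}

// Hypothetical plugin; it would be enabled with --conf spark.plugins=com.example.MyPlugin
class MyPlugin extends SparkPlugin {
  // Driver-side component; this sketch just relies on the interface's default methods.
  override def driverPlugin(): DriverPlugin = new DriverPlugin {}

  // Executor-side component, the replacement for the old org.apache.spark.ExecutorPlugin hook.
  override def executorPlugin(): ExecutorPlugin = new ExecutorPlugin {}
}
```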
4 changes: 4 additions & 0 deletions docs/sql-migration-guide.md
@@ -216,6 +216,10 @@ license: |

* The decimal string representation can be different between Hive 1.2 and Hive 2.3 when using `TRANSFORM` operator in SQL for script transformation, which depends on hive's behavior. In Hive 1.2, the string representation omits trailing zeroes. But in Hive 2.3, it is always padded to 18 digits with trailing zeroes if necessary.

## Upgrading from Spark SQL 2.4.5 to 2.4.6

- In Spark 2.4.6, the `RESET` command does not reset the static SQL configuration values to the default. It only clears the runtime SQL configuration values.

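To make the new `RESET` behaviour concrete, a minimal sketch, assuming a fresh local session. The database name is a placeholder; `spark.sql.globalTempDatabase` is a static SQL conf, while `spark.sql.shuffle.partitions` is a runtime one:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local")
  // Static SQL conf, fixed once the session is created.
  .config("spark.sql.globalTempDatabase", "my_global_db")
  .getOrCreate()

// Runtime SQL conf, set after the session exists.
spark.conf.set("spark.sql.shuffle.partitions", "5")

spark.sql("RESET")

// The runtime conf falls back to its default (200)...
assert(spark.conf.get("spark.sql.shuffle.partitions") == "200")
// ...while the static conf keeps the value the session was started with.
assert(spark.conf.get("spark.sql.globalTempDatabase") == "my_global_db")
```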
## Upgrading from Spark SQL 2.4.4 to 2.4.5

- Since Spark 2.4.5, `TRUNCATE TABLE` command tries to set back original permission and ACLs during re-creating the table/partition paths. To restore the behaviour of earlier versions, set `spark.sql.truncateTable.ignorePermissionAcl.enabled` to `true`.
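For the `TRUNCATE TABLE` note above, a small sketch of opting back into the pre-2.4.5 behaviour; the table name and local session are placeholders:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local").getOrCreate()

// A throwaway managed table, just so TRUNCATE has something to act on.
spark.range(10).write.saveAsTable("my_table")

// Opt back into the pre-2.4.5 behaviour: TRUNCATE TABLE will not attempt to
// restore the original permissions/ACLs when it re-creates the table paths.
spark.sql("SET spark.sql.truncateTable.ignorePermissionAcl.enabled=true")
spark.sql("TRUNCATE TABLE my_table")
```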
@@ -47,6 +47,9 @@ object StaticSQLConf {
.internal()
.version("2.1.0")
.stringConf
+ // System preserved database should not exists in metastore. However it's hard to guarantee it
+ // for every session, because case-sensitivity differs. Here we always lowercase it to make our
+ // life easier.
.transform(_.toLowerCase(Locale.ROOT))
.createWithDefault("global_temp")

@@ -153,9 +153,6 @@ private[sql] class SharedState(
* A manager for global temporary views.
*/
lazy val globalTempViewManager: GlobalTempViewManager = {
- // System preserved database should not exists in metastore. However it's hard to guarantee it
- // for every session, because case-sensitivity differs. Here we always lowercase it to make our
- // life easier.
val globalTempDB = conf.get(GLOBAL_TEMP_DATABASE)
if (externalCatalog.databaseExists(globalTempDB)) {
throw new SparkException(
@@ -22,6 +22,7 @@ import org.scalatest.BeforeAndAfterEach
import org.apache.spark.{SparkConf, SparkContext, SparkFunSuite}
import org.apache.spark.internal.config.UI.UI_ENABLED
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.internal.StaticSQLConf.GLOBAL_TEMP_DATABASE

/**
* Test cases for the builder pattern of [[SparkSession]].
@@ -152,4 +153,19 @@ class SparkSessionBuilderSuite extends SparkFunSuite with BeforeAndAfterEach {
session.sparkContext.hadoopConfiguration.unset(mySpecialKey)
}
}

test("SPARK-31234: RESET command will not change static sql configs and " +
"spark context conf values in SessionState") {
val session = SparkSession.builder()
.master("local")
.config(GLOBAL_TEMP_DATABASE.key, value = "globalTempDB-SPARK-31234")
.config("spark.app.name", "test-app-SPARK-31234")
.getOrCreate()

assert(session.sessionState.conf.getConfString("spark.app.name") === "test-app-SPARK-31234")
assert(session.sessionState.conf.getConf(GLOBAL_TEMP_DATABASE) === "globaltempdb-spark-31234")

[Member Author] This difference between Spark 2.4 and Spark 3.0 is caused by #24979.

session.sql("RESET")
assert(session.sessionState.conf.getConfString("spark.app.name") === "test-app-SPARK-31234")
assert(session.sessionState.conf.getConf(GLOBAL_TEMP_DATABASE) === "globaltempdb-spark-31234")
}
}
@@ -116,7 +116,7 @@ class SQLConfSuite extends QueryTest with SharedSparkSession {
}
}

test("reset will not change static sql configs and spark core configs") {
test("SPARK-31234: reset will not change static sql configs and spark core configs") {
val conf = spark.sparkContext.getConf.getAll.toMap
val appName = conf.get("spark.app.name")
val driverHost = conf.get("spark.driver.host")