Changes from 3 commits
@@ -74,7 +74,8 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM

   for (enrichErrorEnabled <- Seq(false, true)) {
     test(s"cause exception - ${enrichErrorEnabled}") {
-      withSQLConf("spark.sql.connect.enrichError.enabled" -> enrichErrorEnabled.toString) {
+      withSQLConf("spark.sql.connect.enrichError.enabled" -> enrichErrorEnabled.toString,
+        "spark.sql.legacy.timeParserPolicy" -> "EXCEPTION") {
         val ex = intercept[SparkUpgradeException] {
           spark
             .sql("""
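
The pin to EXCEPTION is what keeps `intercept[SparkUpgradeException]` working now that the default policy no longer throws. A minimal sketch of the difference, assuming a local `SparkSession`; the query is an illustrative stand-in (the suite's own query body is truncated in the diff above), relying on the documented prefix-parsing gap between the legacy and java.time parsers:

import org.apache.spark.sql.SparkSession

object TimeParserPolicyCheck extends App {
  val spark = SparkSession.builder().master("local[1]").getOrCreate()

  // Legacy SimpleDateFormat parses the 'yyyy-MM-dd' prefix and ignores the
  // trailing 'T20:06:11'; the java.time parser rejects the whole string.
  // Under EXCEPTION, that divergence surfaces as SparkUpgradeException.
  spark.conf.set("spark.sql.legacy.timeParserPolicy", "EXCEPTION")
  try spark.sql("select to_date('2020-01-27T20:06:11', 'yyyy-MM-dd')").collect()
  catch { case e: Exception => println(s"threw ${e.getClass.getSimpleName}") }

  // Under CORRECTED, the new default, the same query quietly returns NULL,
  // so a test expecting the exception must set the policy explicitly.
  spark.conf.set("spark.sql.legacy.timeParserPolicy", "CORRECTED")
  spark.sql("select to_date('2020-01-27T20:06:11', 'yyyy-MM-dd')").show()
}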
docs/sql-migration-guide.md (2 additions, 0 deletions)
@@ -46,6 +46,8 @@ license: |
 - Since Spark 4.0, MySQL JDBC datasource will read FLOAT as FloatType, while in Spark 3.5 and previous, it was read as DoubleType. To restore the previous behavior, you can cast the column to the old type.
 - Since Spark 4.0, MySQL JDBC datasource will read BIT(n > 1) as BinaryType, while in Spark 3.5 and previous, read as LongType. To restore the previous behavior, set `spark.sql.legacy.mysql.bitArrayMapping.enabled` to `true`.
 - Since Spark 4.0, MySQL JDBC datasource will write ShortType as SMALLINT, while in Spark 3.5 and previous, write as INTEGER. To restore the previous behavior, you can replace the column with IntegerType whenever before writing.
+- Since Spark 4.0, the default value for `spark.sql.legacy.ctePrecedencePolicy` has been changed from `EXCEPTION` to `CORRECTED`. Instead of raising an error, inner CTE definitions take precedence over outer definitions.
+- Since Spark 4.0, the default value for `spark.sql.legacy.timeParserPolicy` has been changed from `EXCEPTION` to `CORRECTED`. Instead of raising an error, more values are accepted for certain [Datetime Patterns for Formatting and Parsing](sql-ref-datetime-pattern.html) by default.

## Upgrading from Spark SQL 3.5.1 to 3.5.2

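For readers of the guide: restoring the 3.x fail-fast behavior is a one-line session setting per config. A minimal sketch for spark-shell, assuming a running session named `spark` (the keys and the `EXCEPTION` value come straight from the entries above):

// Opt back into the pre-4.0 behavior for the current session; both configs
// take the LegacyBehaviorPolicy values LEGACY, CORRECTED, and EXCEPTION.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "EXCEPTION")
spark.conf.set("spark.sql.legacy.ctePrecedencePolicy", "EXCEPTION")

The same pair can also be set cluster-wide via `--conf` at submit time.
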
@@ -4016,13 +4016,13 @@ object SQLConf {
.doc("When LEGACY, java.text.SimpleDateFormat is used for formatting and parsing " +
"dates/timestamps in a locale-sensitive manner, which is the approach before Spark 3.0. " +
"When set to CORRECTED, classes from java.time.* packages are used for the same purpose. " +
"The default value is EXCEPTION, RuntimeException is thrown when we will get different " +
"results.")
"When set to EXCEPTION, RuntimeException is thrown when we will get different " +
"results. The default is CORRECTED.")
.version("3.0.0")
.stringConf
.transform(_.toUpperCase(Locale.ROOT))
.checkValues(LegacyBehaviorPolicy.values.map(_.toString))
.createWithDefault(LegacyBehaviorPolicy.EXCEPTION.toString)
.createWithDefault(LegacyBehaviorPolicy.CORRECTED.toString)

val LEGACY_ARRAY_EXISTS_FOLLOWS_THREE_VALUED_LOGIC =
buildConf("spark.sql.legacy.followThreeValuedLogicInArrayExists")
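
The builder chain above uppercases input before validating it, so mixed-case values like "corrected" pass `checkValues`. A standalone REPL-style sketch of that transform-then-check-then-default pipeline; the `LegacyBehaviorPolicy` object here is a stand-in re-declaration for illustration, not Spark's own:

import java.util.Locale

// Stand-in for Spark's LegacyBehaviorPolicy enumeration (illustrative only).
object LegacyBehaviorPolicy extends Enumeration {
  val EXCEPTION, LEGACY, CORRECTED = Value
}

// Mirrors the builder: transform(_.toUpperCase(Locale.ROOT)), then
// checkValues(...), then the new createWithDefault(CORRECTED) fallback.
def resolvePolicy(raw: Option[String]): String = {
  val normalized = raw
    .map(_.toUpperCase(Locale.ROOT))
    .getOrElse(LegacyBehaviorPolicy.CORRECTED.toString)
  require(LegacyBehaviorPolicy.values.map(_.toString).contains(normalized),
    s"invalid value for spark.sql.legacy.timeParserPolicy: $normalized")
  normalized
}

assert(resolvePolicy(Some("corrected")) == "CORRECTED")
assert(resolvePolicy(None) == "CORRECTED") // unset key now resolves to CORRECTED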