[SPARK-31867][SQL][FOLLOWUP] Check result differences for datetime formatting #28736
Conversation
Test build #123560 has finished for PR 28736 at commit

cc @cloud-fan @maropu @MaxGekk thanks

Test build #123562 has finished for PR 28736 at commit

thanks, merging to master/3.0!
…rmatting

### What changes were proposed in this pull request?
In this PR, we throw `SparkUpgradeException` when getting `DateTimeException` for datetime formatting under the `EXCEPTION` legacy Time Parser Policy.

### Why are the changes needed?
`DateTimeException` is also declared by `java.time.format.DateTimeFormatter#format`, but in Spark it can barely occur. We have suspected only one case so far, due to a JDK bug; see https://bugs.openjdk.java.net/browse/JDK-8079628.

For the `from_unixtime` function, we would otherwise suppress the `DateTimeException` caused by `DD` and return `NULL`. That is a silent data change and should be avoided on Java 8.

### Does this PR introduce _any_ user-facing change?
Yes. When running on Java 8 and using the `from_unixtime` function with pattern `DD` to format datetimes, if day-of-year >= 100, a `SparkUpgradeException` will alert users instead of silently returning null. For `date_format`, `SparkUpgradeException` takes the place of `DateTimeException`.

### How was this patch tested?
Add unit tests.

Closes #28736 from yaooqinn/SPARK-31867-F.

Authored-by: Kent Yao <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
(cherry picked from commit fc6af9d)
Signed-off-by: Wenchen Fan <[email protected]>
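The catch-and-rethrow approach can be sketched outside Spark as a small Java wrapper. This is a minimal illustration, not Spark code: `UpgradeHintException` and `formatOrHint` are hypothetical names standing in for `SparkUpgradeException` and Spark's internal formatter helper. The demo triggers a `DateTimeException` deterministically by formatting a date-only value with a time pattern (on modern JDKs the `DD` case itself no longer throws).

```java
import java.time.DateTimeException;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.temporal.TemporalAccessor;

public class FormatHint {
    // Hypothetical stand-in for Spark's SparkUpgradeException.
    static class UpgradeHintException extends RuntimeException {
        UpgradeHintException(String msg, Throwable cause) { super(msg, cause); }
    }

    // Surface DateTimeException as an actionable hint instead of letting
    // callers silently swallow it and return null (the silent data change
    // the PR description warns about).
    static String formatOrHint(DateTimeFormatter fmt, TemporalAccessor value) {
        try {
            return fmt.format(value);
        } catch (DateTimeException e) {
            throw new UpgradeHintException(
                "Fail to format with the new formatter; "
                    + "consider setting a legacy time parser policy", e);
        }
    }

    public static void main(String[] args) {
        // Pattern "HH" needs a time field, so formatting a LocalDate throws
        // DateTimeException (via its subclass UnsupportedTemporalTypeException).
        try {
            formatOrHint(DateTimeFormatter.ofPattern("HH"), LocalDate.of(2020, 6, 5));
            System.out.println("no exception");
        } catch (UpgradeHintException e) {
            System.out.println("UpgradeHintException caused by "
                + e.getCause().getClass().getSimpleName());
        }
    }
}
```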
``` scala
} catch {
  case _: Throwable => throw e
}
throw new SparkUpgradeException("3.0", s"Fail to format it to '$resultCandidate' in the new" +
```
@HyukjinKwon Should we combine the JVM stacktrace for SparkUpgradeException in the python side?
+1!
…erSuite

### What changes were proposed in this pull request?
This PR removes the legacy workaround for JDK 8 in #28736.

### Why are the changes needed?
- We still need the main code for completeness, and in case there are other diffs in future JDK versions, so this PR only fixes the tests.
- We dropped JDK 11/8 at SPARK-44112.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Fixed unit tests.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #43085 from HyukjinKwon/SPARK-45300.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
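The JDK 8 quirk this follow-up stops testing for can be checked directly. A minimal sketch, assuming a JDK 9+ runtime where the JDK-8079628 fix applies: pattern `DD` still pads day-of-year to two digits, but no longer rejects three-digit values (on JDK 8 the second call threw `DateTimeException`).

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DayOfYearDemo {
    public static void main(String[] args) {
        DateTimeFormatter dd = DateTimeFormatter.ofPattern("DD");

        // Day-of-year 9 is zero-padded to the minimum width of two digits.
        System.out.println(dd.format(LocalDate.ofYearDay(2023, 9)));

        // Day-of-year 100: JDK 8 threw DateTimeException here because "DD"
        // was a fixed two-digit field; JDK 9+ simply prints all three digits.
        System.out.println(dd.format(LocalDate.ofYearDay(2023, 100)));
    }
}
```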
What changes were proposed in this pull request?

In this PR, we throw `SparkUpgradeException` when getting `DateTimeException` for datetime formatting under the `EXCEPTION` legacy Time Parser Policy.

Why are the changes needed?

`DateTimeException` is also declared by `java.time.format.DateTimeFormatter#format`, but in Spark it can barely occur. We have suspected only one case so far, due to a JDK bug; see https://bugs.openjdk.java.net/browse/JDK-8079628.

For the `from_unixtime` function, we would otherwise suppress the `DateTimeException` caused by `DD` and return `NULL`. That is a silent data change and should be avoided on Java 8.

Does this PR introduce any user-facing change?

Yes. When running on Java 8 and using the `from_unixtime` function with pattern `DD` to format datetimes, if day-of-year >= 100, a `SparkUpgradeException` will alert users instead of silently returning null. For `date_format`, `SparkUpgradeException` takes the place of `DateTimeException`.

How was this patch tested?

Add unit tests.