[SPARK-40387][SQL] Improve the implementation of Spark Decimal #37830
Conversation
The failing test case is not related to this PR.

I don't think

OK
  } else {
    BigDecimal(longVal, _scale)
  }

  def toBigDecimal: BigDecimal = if (decimalVal.ne(null))
can we avoid pure code style changes? the previous code style was not wrong.
OK
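For readers following along, here is a minimal, hypothetical sketch of the dual representation that `toBigDecimal` bridges: a value held either as an unscaled `Long` plus a scale (fast path) or as a `BigDecimal` (slow path). `SimpleDecimal` is an illustration only, not Spark's actual class.

```scala
import scala.math.BigDecimal

// Simplified illustration (not Spark's Decimal): the value is held either as an
// unscaled Long plus a scale, or as a BigDecimal when it does not fit the fast path.
class SimpleDecimal(longVal: Long, _scale: Int, decimalVal: BigDecimal) {
  def toBigDecimal: BigDecimal =
    if (decimalVal ne null) decimalVal
    else BigDecimal(longVal, _scale) // reinterpret the unscaled long with the scale
}

object SimpleDecimalDemo {
  def main(args: Array[String]): Unit = {
    val d = new SimpleDecimal(12345L, 2, null)
    println(d.toBigDecimal) // 123.45
  }
}
```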
  def % (that: Decimal): Decimal =
-   if (that.isZero) null
-   else Decimal(toJavaBigDecimal.remainder(that.toJavaBigDecimal, MATH_CONTEXT))
+   if (that.isZero) null else Decimal(toJavaBigDecimal.remainder(that.toJavaBigDecimal,
let's avoid pure code style changes.
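For context on the remainder logic being reflowed here, a small standalone sketch using `java.math.BigDecimal` directly; `MathContext.DECIMAL128` is only a stand-in assumption for Spark's `MATH_CONTEXT`, and `remainderOrNull` is a hypothetical helper.

```scala
import java.math.{BigDecimal => JBigDecimal, MathContext}

// Minimal sketch of the remainder logic discussed above, using plain
// java.math.BigDecimal instead of Spark's Decimal wrapper.
object RemainderSketch {
  // DECIMAL128 is an assumption here; it stands in for Spark's MATH_CONTEXT.
  private val MATH_CONTEXT = MathContext.DECIMAL128

  def remainderOrNull(a: JBigDecimal, b: JBigDecimal): JBigDecimal =
    if (b.signum == 0) null // mirrors "if (that.isZero) null"
    else a.remainder(b, MATH_CONTEXT)

  def main(args: Array[String]): Unit = {
    println(remainderOrNull(new JBigDecimal("7.5"), new JBigDecimal("2"))) // 1.5
    println(remainderOrNull(new JBigDecimal("7.5"), JBigDecimal.ZERO))     // null
  }
}
```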
- def abs: Decimal = if (this.compare(Decimal.ZERO) < 0) this.unary_- else this
+ def abs: Decimal = if (this < Decimal.ZERO) this.unary_- else this
does this have a real impact?
`this < Decimal.ZERO` is clearer.
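A short sketch of why the operator form is equivalent: a class that mixes in `scala.math.Ordered` gets `<`, `<=`, `>` and `>=` derived from `compare`, so `this < Decimal.ZERO` is just sugar over the same comparison. `Amount` below is a toy class, not Spark's `Decimal`.

```scala
// Mixing in Ordered derives the comparison operators from compare.
final class Amount(val cents: Long) extends Ordered[Amount] {
  override def compare(that: Amount): Int = java.lang.Long.compare(cents, that.cents)
}

object AmountDemo {
  val ZERO = new Amount(0)

  def main(args: Array[String]): Unit = {
    val a = new Amount(-5)
    println(a.compare(ZERO) < 0) // true
    println(a < ZERO)            // true: identical result, shorter to read
  }
}
```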
  def toFloat: Float = toBigDecimal.floatValue

  private def rawLongValue: Long = longVal / POW_10(_scale)
let's still use the old name `actualLongVal`
OK
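A rough sketch of what this helper computes, assuming (as the diff suggests) that `longVal` holds the unscaled value: e.g. 123.45 with scale 2 is stored as 12345, and dividing by 10^scale recovers the integral part. The standalone object below is illustrative only.

```scala
// Illustrative only: recover the integral part of an unscaled long-with-scale value.
object ActualLongValSketch {
  // Powers of ten up to 10^18, the largest that fits in a Long.
  private val POW_10 = Array.tabulate[Long](19)(i => math.pow(10, i).toLong)

  def actualLongVal(longVal: Long, scale: Int): Long = longVal / POW_10(scale)

  def main(args: Array[String]): Unit = {
    println(actualLongVal(12345L, 2)) // 123
    println(actualLongVal(-999L, 3))  // 0 (truncates toward zero)
  }
}
```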
thanks, merging to master!

@cloud-fan Thank you!
### What changes were proposed in this pull request?
This PR improves the implementation of Spark `Decimal`. The improvement points are as follows:
1. Use `toJavaBigDecimal` instead of `toBigDecimal.bigDecimal`.
2. Extract `longVal / POW_10(_scale)` as a new method `def actualLongVal: Long`.
3. Remove `BIG_DEC_ZERO` and use `decimalVal.signum` to judge whether the value equals zero.
4. Use `<` instead of `compare`.
5. Correct some code style.

### Why are the changes needed?
Improve the implementation of Spark Decimal.

### Does this PR introduce _any_ user-facing change?
No. It only updates the internal implementation.

### How was this patch tested?
N/A

Closes apache#37830 from beliefer/SPARK-40387.

Authored-by: Jiaan Geng <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
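As an illustration of point 3 above, a hedged sketch of a signum-based zero check; the `isZero` helper below is hypothetical and not Spark's exact code.

```scala
import scala.math.BigDecimal

// Hypothetical helper illustrating point 3: signum == 0 identifies zero without
// comparing against a dedicated BIG_DEC_ZERO constant.
object IsZeroSketch {
  def isZero(decimalVal: BigDecimal, longVal: Long): Boolean =
    if (decimalVal ne null) decimalVal.signum == 0 // signum is -1, 0, or 1
    else longVal == 0

  def main(args: Array[String]): Unit = {
    println(isZero(BigDecimal("0.000"), 0L)) // true: scale doesn't matter for signum
    println(isZero(null, 42L))               // false
  }
}
```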