fix: skip negative scale checks for creating decimals #723
Conversation
@@ -30,7 +30,7 @@ public class CometDictionary implements AutoCloseable {
   private final int numValues;

   /** Decoded dictionary values. We only need to copy values for decimal type. */
-  private ByteArrayWrapper[] binaries;
+  private volatile ByteArrayWrapper[] binaries;
This is not directly related, but I forgot to add it based on #705 (comment).
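Not part of the PR itself, just an illustration of why the volatile qualifier matters here: if the decoded dictionary array is built lazily on one thread and read from another, publishing it through a volatile field ensures readers never observe a partially initialized array. A minimal sketch with made-up class and field names:

```java
import java.nio.charset.StandardCharsets;

// Illustrative only: lazily decodes binary dictionary values and publishes
// them through a volatile field so concurrent readers never see a partially
// constructed array.
class LazyBinaryDictionary {
  private final byte[][] encoded;    // raw dictionary entries
  private volatile byte[][] decoded; // safely published decoded values

  LazyBinaryDictionary(byte[][] encoded) {
    this.encoded = encoded;
  }

  byte[] get(int index) {
    byte[][] local = decoded; // single volatile read
    if (local == null) {
      synchronized (this) {
        local = decoded;
        if (local == null) {
          local = new byte[encoded.length][];
          for (int i = 0; i < encoded.length; i++) {
            local[i] = encoded[i].clone(); // stand-in for real decoding
          }
          decoded = local; // volatile write publishes the fully built array
        }
      }
    }
    return local[index];
  }

  public static void main(String[] args) {
    byte[][] raw = { "a".getBytes(StandardCharsets.UTF_8) };
    System.out.println(new String(new LazyBinaryDictionary(raw).get(0), StandardCharsets.UTF_8));
  }
}
```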
    }
  }

  /** This method skips the negative scale check, otherwise the same as Decimal.createUnsafe(). */
  private Decimal createDecimal(long unscaled, int precision, int scale) {
    Decimal dec = new Decimal();
It seems the check for negative scale is a feature for ANSI correctness? https://issues.apache.org/jira/browse/SPARK-30252
It does not need to do the check on every read; we are running ANSI tests in spark_sql_test_ansi.yml.
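For readers unfamiliar with what the skipped check guards against: a negative scale multiplies the unscaled value by a power of ten, and SPARK-30252 disallows such scales by default. A plain java.math.BigDecimal sketch of what a negative-scale value looks like (nothing here is Comet or Spark code):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

// A decimal with negative scale scales the unscaled value up by 10^(-scale).
// SPARK-30252 rejects such scales by default, which is what the per-value
// check in Decimal.createUnsafe() enforces.
public class NegativeScaleExample {
  public static void main(String[] args) {
    BigDecimal normal = new BigDecimal(BigInteger.valueOf(12345), 2);        // 123.45
    BigDecimal negativeScale = new BigDecimal(BigInteger.valueOf(123), -2);  // 12300
    System.out.println(normal);        // 123.45
    System.out.println(negativeScale); // 1.23E+4
  }
}
```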
  /** This method skips a few checks, otherwise the same as Decimal.apply(). */
  private Decimal createDecimal(BigDecimal value, int precision, int scale) {
    Decimal dec = new Decimal();
    Platform.putObjectVolatile(dec, decimalValOffset, new scala.math.BigDecimal(value));
The Spark version has some additional checks for rounding and precision. Do we not need those?

this.decimalVal = decimal.setScale(scale, ROUND_HALF_UP)
if (decimalVal.precision > precision) {
  throw QueryExecutionErrors.decimalPrecisionExceedsMaxPrecisionError(
    decimalVal.precision, precision)
}
No, we do not need the precision check because it is done in advance.
Just for my knowledge, where?
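The thread does not pin down where the up-front validation happens, so the following is only a hypothetical sketch of the general pattern being described: validate precision and scale once when the column is set up, and keep the per-value path check-free. All names below are invented for illustration and are not Comet code:

```java
import java.math.BigDecimal;

// Hypothetical example: validate the decimal type once up front, then build
// values in the hot loop without repeating the checks per row.
final class DecimalColumnSketch {
  private final int precision;
  private final int scale;

  DecimalColumnSketch(int precision, int scale) {
    // One-time validation, analogous to checks done when the schema is read.
    if (precision < 1 || precision > 38) {
      throw new IllegalArgumentException("precision out of range: " + precision);
    }
    if (scale < 0 || scale > precision) {
      throw new IllegalArgumentException("invalid scale: " + scale);
    }
    this.precision = precision;
    this.scale = scale;
  }

  /** Hot path: no validation, the bounds were vetted in the constructor. */
  BigDecimal read(long unscaled) {
    return BigDecimal.valueOf(unscaled, scale);
  }

  public static void main(String[] args) {
    DecimalColumnSketch col = new DecimalColumnSketch(10, 2);
    System.out.println(col.read(12345)); // 123.45
  }
}
```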
@andygrove @comphead @huaxingao @parthchandra @viirya
lgtm
(cherry picked from commit fbe86d0)
Which issue does this PR close?
Part of #679 and #670
Rationale for this change
This PR improves the performance of getDecimal.
What changes are included in this PR?
The new createDecimal method skips the negative scale check.
How are these changes tested?
Existing tests
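To make the description concrete, here is a rough sketch of what a check-skipping createDecimal(long, int, int) could look like next to Spark's Decimal.createUnsafe(). The actual PR code is only partially visible above; the class name, the field names longVal, _precision and _scale, and the use of sun.misc.Unsafe for field offsets are assumptions about Spark's Decimal internals, not the PR's implementation:

```java
import java.lang.reflect.Field;
import org.apache.spark.sql.types.Decimal;
import sun.misc.Unsafe;

// Illustrative sketch: build a Decimal from an unscaled long without the
// per-value negative scale check that Decimal.createUnsafe() performs.
// Field names are assumptions about Spark's Decimal internals and may
// differ between Spark versions.
final class UncheckedDecimals {
  private static final Unsafe UNSAFE;
  private static final long LONG_VAL_OFFSET;
  private static final long PRECISION_OFFSET;
  private static final long SCALE_OFFSET;

  static {
    try {
      Field f = Unsafe.class.getDeclaredField("theUnsafe");
      f.setAccessible(true);
      UNSAFE = (Unsafe) f.get(null);
      LONG_VAL_OFFSET = UNSAFE.objectFieldOffset(Decimal.class.getDeclaredField("longVal"));
      PRECISION_OFFSET = UNSAFE.objectFieldOffset(Decimal.class.getDeclaredField("_precision"));
      SCALE_OFFSET = UNSAFE.objectFieldOffset(Decimal.class.getDeclaredField("_scale"));
    } catch (ReflectiveOperationException e) {
      throw new ExceptionInInitializerError(e);
    }
  }

  /** Same result as Decimal.createUnsafe(unscaled, precision, scale), minus the scale check. */
  static Decimal createDecimal(long unscaled, int precision, int scale) {
    Decimal dec = new Decimal();
    UNSAFE.putLong(dec, LONG_VAL_OFFSET, unscaled);
    UNSAFE.putInt(dec, PRECISION_OFFSET, precision);
    UNSAFE.putInt(dec, SCALE_OFFSET, scale);
    return dec;
  }
}
```

Per the review discussion above, skipping the scale check on this path is considered acceptable because the column's scale is fixed and validated before the hot read path runs, rather than on every read.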