Merged
73 commits
Changes from 1 commit
e14b545
[SPARK-7977] [BUILD] Disallowing println
jonalter Jul 10, 2015
11e22b7
[SPARK-7944] [SPARK-8013] Remove most of the Spark REPL fork for Scal…
dragos Jul 10, 2015
5dd45bd
[SPARK-8958] Dynamic allocation: change cached timeout to infinity
Jul 10, 2015
db6d57f
[CORE] [MINOR] change the log level to info
chenghao-intel Jul 10, 2015
c185f3a
[SPARK-8675] Executors created by LocalBackend won't get the same cla…
coderplay Jul 10, 2015
05ac023
[HOTFIX] fix flaky test in PySpark SQL
Jul 10, 2015
0772026
[SPARK-8923] [DOCUMENTATION, MLLIB] Add @since tags to mllib.fpm
rahulpalamuttam Jul 10, 2015
fb8807c
[SPARK-7078] [SPARK-7079] Binary processing sort for Spark SQL
JoshRosen Jul 10, 2015
857e325
[SPARK-8990] [SQL] SPARK-8990 DataFrameReader.parquet() should respec…
liancheng Jul 10, 2015
b6fc0ad
add inline comment for python tests
davies Jul 11, 2015
3363088
[SPARK-8961] [SQL] Makes BaseWriterContainer.outputWriterForRow accep…
liancheng Jul 11, 2015
6e1c7e2
[SPARK-7735] [PYSPARK] Raise Exception on non-zero exit from pipe com…
megatron-me-uk Jul 11, 2015
9c50757
[SPARK-8598] [MLLIB] Implementation of 1-sample, two-sided, Kolmogoro…
Jul 11, 2015
7f6be1f
[SPARK-6487] [MLLIB] Add sequential pattern mining algorithm PrefixSp…
zhangjiajin Jul 11, 2015
0c5207c
[SPARK-8994] [ML] tiny cleanups to Params, Pipeline
jkbradley Jul 11, 2015
c472eb1
[SPARK-8970][SQL] remove unnecessary abstraction for ExtractValue
cloud-fan Jul 11, 2015
3009088
[SPARK-8880] Fix confusing Stage.attemptId member variable
kayousterhout Jul 13, 2015
20b4743
[SPARK-9006] [PYSPARK] fix microsecond loss in Python 3
Jul 13, 2015
92540d2
[SPARK-8203] [SPARK-8204] [SQL] conditional function: least/greatest
adrian-wang Jul 13, 2015
6b89943
[SPARK-8944][SQL] Support casting between IntervalType and StringType
cloud-fan Jul 13, 2015
a5bc803
[SPARK-8596] Add module for rstudio link to spark
koaning Jul 13, 2015
7f487c8
[SPARK-6797] [SPARKR] Add support for YARN cluster mode.
Jul 13, 2015
9b62e93
[SPARK-8706] [PYSPARK] [PROJECT INFRA] Add pylint checks to PySpark
MechCoder Jul 13, 2015
5ca26fb
[SPARK-8950] [WEBUI] Correct the calculation of SchedulerDelay in Sta…
carsonwang Jul 13, 2015
79c3582
Revert "[SPARK-8706] [PYSPARK] [PROJECT INFRA] Add pylint checks to P…
davies Jul 13, 2015
5c41691
[SPARK-8954] [BUILD] Remove unneeded deb repository from Dockerfile t…
yongtang Jul 13, 2015
714fc55
[SPARK-8991] [ML] Update SharedParamsCodeGen's Generated Documentation
Jul 13, 2015
4c797f2
[SPARK-8636] [SQL] Fix equalNullSafe comparison
Jul 13, 2015
0aed38e
[SPARK-8533] [STREAMING] Upgrade Flume to 1.6.0
harishreedharan Jul 13, 2015
b7bcbe2
[SPARK-8743] [STREAMING] Deregister Codahale metrics for streaming wh…
Jul 13, 2015
408b384
[SPARK-6910] [SQL] Support for pushing predicates down to metastore f…
Jul 14, 2015
20c1434
[SPARK-9001] Fixing errors in javadocs that lead to failed build/sbt doc
jegonzal Jul 14, 2015
c1feebd
[SPARK-9010] [DOCUMENTATION] Improve the Spark Configuration document…
stanzhai Jul 14, 2015
257236c
[SPARK-6851] [SQL] function least/greatest follow up
adrian-wang Jul 14, 2015
59d820a
[SPARK-9029] [SQL] shortcut CaseKeyWhen if key is null
cloud-fan Jul 14, 2015
37f2d96
[SPARK-9027] [SQL] Generalize metastore predicate pushdown
marmbrus Jul 14, 2015
c4e98ff
[SPARK-8933] [BUILD] Provide a --force flag to build/mvn that always …
Jul 14, 2015
8fb3a65
[SPARK-8911] Fix local mode endless heartbeats
Jul 14, 2015
d267c28
[SPARK-9031] Merge BlockObjectWriter and DiskBlockObject writer to re…
JoshRosen Jul 14, 2015
0a4071e
[SPARK-8718] [GRAPHX] Improve EdgePartition2D for non perfect square …
aray Jul 14, 2015
fb1d06f
[SPARK-4072] [CORE] Display Streaming blocks in Streaming UI
zsxwing Jul 14, 2015
4b5cfc9
[SPARK-8800] [SQL] Fix inaccurate precision/scale of Decimal division…
viirya Jul 14, 2015
740b034
[SPARK-4362] [MLLIB] Make prediction probability available in NaiveBa…
srowen Jul 14, 2015
11e5c37
[SPARK-8962] Add Scalastyle rule to ban direct use of Class.forName; …
JoshRosen Jul 14, 2015
e965a79
[SPARK-9045] Fix Scala 2.11 build break in UnsafeExternalRowSorter
JoshRosen Jul 15, 2015
cc57d70
[SPARK-9050] [SQL] Remove unused newOrdering argument from Exchange (…
JoshRosen Jul 15, 2015
f957796
[SPARK-8820] [STREAMING] Add a configuration to set checkpoint dir.
SaintBacchus Jul 15, 2015
bb870e7
[SPARK-5523] [CORE] [STREAMING] Add a cache for hostname in TaskMetri…
jerryshao Jul 15, 2015
5572fd0
[HOTFIX] Adding new names to known contributors
pwendell Jul 15, 2015
f650a00
[SPARK-8808] [SPARKR] Fix assignments in SparkR.
Jul 15, 2015
f23a721
[SPARK-8993][SQL] More comprehensive type checking in expressions.
rxin Jul 15, 2015
c6b1a9e
Revert SPARK-6910 and SPARK-9027
marmbrus Jul 15, 2015
4692769
[SPARK-6259] [MLLIB] Python API for LDA
yu-iskw Jul 15, 2015
3f6296f
[SPARK-8018] [MLLIB] KMeans should accept initial cluster centers as …
FlytxtRnD Jul 15, 2015
f0e1297
[SPARK-8279][SQL]Add math function round
yjshen Jul 15, 2015
1bb8acc
[SPARK-8997] [MLLIB] Performance improvements in LocalPrefixSpan
Jul 15, 2015
14935d8
[HOTFIX][SQL] Unit test breaking.
rxin Jul 15, 2015
adb33d3
[SPARK-9012] [WEBUI] Escape Accumulators in the task table
zsxwing Jul 15, 2015
20bb10f
[SPARK-8706] [PYSPARK] [PROJECT INFRA] Add pylint checks to PySpark
MechCoder Jul 15, 2015
6f69025
[SPARK-8840] [SPARKR] Add float coercion on SparkR
viirya Jul 15, 2015
fa4ec36
[SPARK-9020][SQL] Support mutable state in code gen expressions
cloud-fan Jul 15, 2015
a938527
[SPARK-8221][SQL]Add pmod function
zhichao-li Jul 15, 2015
9716a72
[Minor][SQL] Allow spaces in the beginning and ending of string for I…
viirya Jul 15, 2015
303c120
[SPARK-7555] [DOCS] Add doc for elastic net in ml-guide and mllib-guide
coderxiang Jul 15, 2015
ec9b621
SPARK-9070 JavaDataFrameSuite teardown NPEs if setup failed
steveloughran Jul 15, 2015
536533c
[SPARK-9005] [MLLIB] Fix RegressionMetrics computation of explainedVa…
Jul 15, 2015
b9a922e
[SPARK-6602][Core]Replace Akka Serialization with Spark Serializer
zsxwing Jul 15, 2015
674eb2a
[SPARK-8974] Catch exceptions in allocation schedule task.
Jul 15, 2015
affbe32
[SPARK-9071][SQL] MonotonicallyIncreasingID and SparkPartitionID shou…
rxin Jul 15, 2015
5599cc4
Predicate pushdown to hive metastore
Jul 15, 2015
b3cb5af
Synchronize getPartitionsByFilter
Jul 17, 2015
acf96d1
Synchronize on hive client
Jul 17, 2015
f897087
Synchronize on this
Jul 17, 2015
[SPARK-8279][SQL] Add math function round
JIRA: https://issues.apache.org/jira/browse/SPARK-8279

Author: Yijie Shen <[email protected]>

Closes apache#6938 from yijieshen/udf_round_3 and squashes the following commits:

07a124c [Yijie Shen] remove useless def children
392b65b [Yijie Shen] add negative scale test in DecimalSuite
61760ee [Yijie Shen] address reviews
302a78a [Yijie Shen] Add dataframe function test
31dfe7c [Yijie Shen] refactor round to make it readable
8c7a949 [Yijie Shen] rebase & inputTypes update
9555e35 [Yijie Shen] tiny style fix
d10be4a [Yijie Shen] use TypeCollection to specify wanted input and implicit cast
c3b9839 [Yijie Shen] rely on implict cast to handle string input
b0bff79 [Yijie Shen] make round's inner method's name more meaningful
9bd6930 [Yijie Shen] revert accidental change
e6f44c4 [Yijie Shen] refactor eval and genCode
1b87540 [Yijie Shen] modify checkInputDataTypes using foldable
5486b2d [Yijie Shen] DataFrame API modification
2077888 [Yijie Shen] codegen versioned eval
6cd9a64 [Yijie Shen] refactor Round's constructor
9be894e [Yijie Shen] add round functions in o.a.s.sql.functions
7c83e13 [Yijie Shen] more tests on round
56db4bb [Yijie Shen] Add decimal support to Round
7e163ae [Yijie Shen] style fix
653d047 [Yijie Shen] Add math function round
yjshen authored and rxin committed Jul 15, 2015
commit f0e129740dc2442a21dfa7fbd97360df87291095
@@ -117,6 +117,7 @@ object FunctionRegistry {
expression[Pow]("power"),
expression[UnaryPositive]("positive"),
expression[Rint]("rint"),
expression[Round]("round"),
expression[ShiftLeft]("shiftleft"),
expression[ShiftRight]("shiftright"),
expression[ShiftRightUnsigned]("shiftrightunsigned"),
@@ -19,8 +19,10 @@ package org.apache.spark.sql.catalyst.expressions

import java.{lang => jl}

import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckSuccess, TypeCheckFailure}
import org.apache.spark.sql.catalyst.expressions.codegen._
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.types._
import org.apache.spark.unsafe.types.UTF8String

@@ -520,3 +522,202 @@ case class Logarithm(left: Expression, right: Expression)
"""
}
}

/**
* Round the `child`'s result to `scale` decimal places when `scale` >= 0,
* or round the integral part when `scale` < 0.
* For example, round(31.415, 2) evaluates to 31.42 and round(31.415, -1) evaluates to 30.
*
* A child of IntegralType evaluates to itself when `scale` >= 0.
* A child of FractionalType whose value is NaN or infinite always evaluates to itself.
*
* Round's dataType always equals the `child`'s dataType, except for [[DecimalType.Fixed]],
* which leads to a scale update in the DecimalType's [[PrecisionInfo]].
*
* @param child the expression to round; any [[NumericType]] is allowed as input
* @param scale the new scale to round to; this must be a constant int at runtime
*/
case class Round(child: Expression, scale: Expression)
extends BinaryExpression with ExpectsInputTypes {

import BigDecimal.RoundingMode.HALF_UP

def this(child: Expression) = this(child, Literal(0))

override def left: Expression = child
override def right: Expression = scale

// Rounding a Decimal evaluates to null if `changePrecision` fails
override def nullable: Boolean = true

override def foldable: Boolean = child.foldable

override lazy val dataType: DataType = child.dataType match {
// If the new scale is bigger, we are scaling up; keep the
// original scale, as `Decimal` does
case DecimalType.Fixed(p, s) => DecimalType(p, if (_scale > s) s else _scale)
case t => t
}

override def inputTypes: Seq[AbstractDataType] = Seq(NumericType, IntegerType)

override def checkInputDataTypes(): TypeCheckResult = {
super.checkInputDataTypes() match {
case TypeCheckSuccess =>
if (scale.foldable) {
TypeCheckSuccess
} else {
TypeCheckFailure("Only foldable Expression is allowed for scale arguments")
}
case f => f
}
}

// Since `scale` is a constant int, cache its value to avoid repeated evaluation;
// checking scaleV == null also lets both the codegen and non-codegen paths
// skip evaluating `child` unnecessarily.
private lazy val scaleV: Any = scale.eval(EmptyRow)
private lazy val _scale: Int = scaleV.asInstanceOf[Int]

override def eval(input: InternalRow): Any = {
if (scaleV == null) { // if scale is null, no need to eval its child at all
null
} else {
val evalE = child.eval(input)
if (evalE == null) {
null
} else {
nullSafeEval(evalE)
}
}
}

// Not an override: this takes a single argument, because _scale is a constant int at runtime
def nullSafeEval(input1: Any): Any = {
child.dataType match {
case _: DecimalType =>
val decimal = input1.asInstanceOf[Decimal]
if (decimal.changePrecision(decimal.precision, _scale)) decimal else null
case ByteType =>
BigDecimal(input1.asInstanceOf[Byte]).setScale(_scale, HALF_UP).toByte
case ShortType =>
BigDecimal(input1.asInstanceOf[Short]).setScale(_scale, HALF_UP).toShort
case IntegerType =>
BigDecimal(input1.asInstanceOf[Int]).setScale(_scale, HALF_UP).toInt
case LongType =>
BigDecimal(input1.asInstanceOf[Long]).setScale(_scale, HALF_UP).toLong
case FloatType =>
val f = input1.asInstanceOf[Float]
if (f.isNaN || f.isInfinite) {
f
} else {
BigDecimal(f).setScale(_scale, HALF_UP).toFloat
}
case DoubleType =>
val d = input1.asInstanceOf[Double]
if (d.isNaN || d.isInfinite) {
d
} else {
BigDecimal(d).setScale(_scale, HALF_UP).toDouble
}
}
}

override def genCode(ctx: CodeGenContext, ev: GeneratedExpressionCode): String = {
val ce = child.gen(ctx)

val evaluationCode = child.dataType match {
case _: DecimalType =>
s"""
if (${ce.primitive}.changePrecision(${ce.primitive}.precision(), ${_scale})) {
${ev.primitive} = ${ce.primitive};
} else {
${ev.isNull} = true;
}"""
case ByteType =>
if (_scale < 0) {
s"""
${ev.primitive} = new java.math.BigDecimal(${ce.primitive}).
setScale(${_scale}, java.math.BigDecimal.ROUND_HALF_UP).byteValue();"""
} else {
s"${ev.primitive} = ${ce.primitive};"
}
case ShortType =>
if (_scale < 0) {
s"""
${ev.primitive} = new java.math.BigDecimal(${ce.primitive}).
setScale(${_scale}, java.math.BigDecimal.ROUND_HALF_UP).shortValue();"""
} else {
s"${ev.primitive} = ${ce.primitive};"
}
case IntegerType =>
if (_scale < 0) {
s"""
${ev.primitive} = new java.math.BigDecimal(${ce.primitive}).
setScale(${_scale}, java.math.BigDecimal.ROUND_HALF_UP).intValue();"""
} else {
s"${ev.primitive} = ${ce.primitive};"
}
case LongType =>
if (_scale < 0) {
s"""
${ev.primitive} = new java.math.BigDecimal(${ce.primitive}).
setScale(${_scale}, java.math.BigDecimal.ROUND_HALF_UP).longValue();"""
} else {
s"${ev.primitive} = ${ce.primitive};"
}
case FloatType => // if the child evaluates to NaN or Infinity, just return it.
if (_scale == 0) {
s"""
if (Float.isNaN(${ce.primitive}) || Float.isInfinite(${ce.primitive})){
${ev.primitive} = ${ce.primitive};
} else {
${ev.primitive} = Math.round(${ce.primitive});
}"""
} else {
s"""
if (Float.isNaN(${ce.primitive}) || Float.isInfinite(${ce.primitive})){
${ev.primitive} = ${ce.primitive};
} else {
${ev.primitive} = java.math.BigDecimal.valueOf(${ce.primitive}).
setScale(${_scale}, java.math.BigDecimal.ROUND_HALF_UP).floatValue();
}"""
}
case DoubleType => // if the child evaluates to NaN or Infinity, just return it.
if (_scale == 0) {
s"""
if (Double.isNaN(${ce.primitive}) || Double.isInfinite(${ce.primitive})){
${ev.primitive} = ${ce.primitive};
} else {
${ev.primitive} = Math.round(${ce.primitive});
}"""
} else {
s"""
if (Double.isNaN(${ce.primitive}) || Double.isInfinite(${ce.primitive})){
${ev.primitive} = ${ce.primitive};
} else {
${ev.primitive} = java.math.BigDecimal.valueOf(${ce.primitive}).
setScale(${_scale}, java.math.BigDecimal.ROUND_HALF_UP).doubleValue();
}"""
}
}

if (scaleV == null) { // if scale is null, no need to eval its child at all
s"""
boolean ${ev.isNull} = true;
${ctx.javaType(dataType)} ${ev.primitive} = ${ctx.defaultValue(dataType)};
"""
} else {
s"""
${ce.code}
boolean ${ev.isNull} = ${ce.isNull};
${ctx.javaType(dataType)} ${ev.primitive} = ${ctx.defaultValue(dataType)};
if (!${ev.isNull}) {
$evaluationCode
}
"""
}
}

override def prettyName: String = "round"
}
@@ -52,6 +52,13 @@ class ExpressionTypeCheckingSuite extends SparkFunSuite {
s"differing types in '${expr.prettyString}' (int and boolean)")
}

def assertErrorWithImplicitCast(expr: Expression, errorMessage: String): Unit = {
val e = intercept[AnalysisException] {
assertSuccess(expr)
}
assert(e.getMessage.contains(errorMessage))
}

test("check types for unary arithmetic") {
assertError(UnaryMinus('stringField), "operator - accepts numeric type")
assertError(Abs('stringField), "function abs accepts numeric type")
@@ -171,4 +178,14 @@ class ExpressionTypeCheckingSuite extends SparkFunSuite {
CreateNamedStruct(Seq('a.string.at(0), "a", "b", 2.0)),
"Odd position only allow foldable and not-null StringType expressions")
}

test("check types for ROUND") {
assertErrorWithImplicitCast(Round(Literal(null), 'booleanField),
"data type mismatch: argument 2 is expected to be of type int")
assertErrorWithImplicitCast(Round(Literal(null), 'complexField),
"data type mismatch: argument 2 is expected to be of type int")
assertSuccess(Round(Literal(null), Literal(null)))
assertError(Round('booleanField, 'intField),
"data type mismatch: argument 1 is expected to be of type numeric")
}
}
@@ -17,6 +17,8 @@

package org.apache.spark.sql.catalyst.expressions

import scala.math.BigDecimal.RoundingMode

import com.google.common.math.LongMath

import org.apache.spark.SparkFunSuite
@@ -336,4 +338,46 @@ class MathFunctionsSuite extends SparkFunSuite with ExpressionEvalHelper {
null,
create_row(null))
}

test("round") {
val domain = -6 to 6
val doublePi: Double = math.Pi
val shortPi: Short = 31415
val intPi: Int = 314159265
val longPi: Long = 31415926535897932L
val bdPi: BigDecimal = BigDecimal(31415927L, 7)

val doubleResults: Seq[Double] = Seq(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 3.1, 3.14, 3.142,
3.1416, 3.14159, 3.141593)

val shortResults: Seq[Short] = Seq[Short](0, 0, 30000, 31000, 31400, 31420) ++
Seq.fill[Short](7)(31415)

val intResults: Seq[Int] = Seq(314000000, 314200000, 314160000, 314159000, 314159300,
314159270) ++ Seq.fill(7)(314159265)

val longResults: Seq[Long] = Seq(31415926536000000L, 31415926535900000L,
31415926535900000L, 31415926535898000L, 31415926535897900L, 31415926535897930L) ++
Seq.fill(7)(31415926535897932L)

val bdResults: Seq[BigDecimal] = Seq(BigDecimal(3.0), BigDecimal(3.1), BigDecimal(3.14),
BigDecimal(3.142), BigDecimal(3.1416), BigDecimal(3.14159),
BigDecimal(3.141593), BigDecimal(3.1415927))

domain.zipWithIndex.foreach { case (scale, i) =>
checkEvaluation(Round(doublePi, scale), doubleResults(i), EmptyRow)
checkEvaluation(Round(shortPi, scale), shortResults(i), EmptyRow)
checkEvaluation(Round(intPi, scale), intResults(i), EmptyRow)
checkEvaluation(Round(longPi, scale), longResults(i), EmptyRow)
}

// round_scale > current_scale would result in a precision increase, which
// o.a.s.s.types.Decimal.changePrecision does not allow, so the result is null
(0 to 7).foreach { i =>
checkEvaluation(Round(bdPi, i), bdResults(i), EmptyRow)
}
(8 to 10).foreach { scale =>
checkEvaluation(Round(bdPi, scale), null, EmptyRow)
}
}
}
@@ -24,14 +24,14 @@ import org.scalatest.PrivateMethodTester
import scala.language.postfixOps

class DecimalSuite extends SparkFunSuite with PrivateMethodTester {
test("creating decimals") {
/** Check that a Decimal has the given string representation, precision and scale */
def checkDecimal(d: Decimal, string: String, precision: Int, scale: Int): Unit = {
assert(d.toString === string)
assert(d.precision === precision)
assert(d.scale === scale)
}
/** Check that a Decimal has the given string representation, precision and scale */
private def checkDecimal(d: Decimal, string: String, precision: Int, scale: Int): Unit = {
assert(d.toString === string)
assert(d.precision === precision)
assert(d.scale === scale)
}

test("creating decimals") {
checkDecimal(new Decimal(), "0", 1, 0)
checkDecimal(Decimal(BigDecimal("10.030")), "10.030", 5, 3)
checkDecimal(Decimal(BigDecimal("10.030"), 4, 1), "10.0", 4, 1)
@@ -53,6 +53,15 @@ class DecimalSuite extends SparkFunSuite with PrivateMethodTester {
intercept[IllegalArgumentException](Decimal(1e17.toLong, 17, 0))
}

test("creating decimals with negative scale") {
checkDecimal(Decimal(BigDecimal("98765"), 5, -3), "9.9E+4", 5, -3)
checkDecimal(Decimal(BigDecimal("314.159"), 6, -2), "3E+2", 6, -2)
checkDecimal(Decimal(BigDecimal(1.579e12), 4, -9), "1.579E+12", 4, -9)
checkDecimal(Decimal(BigDecimal(1.579e12), 4, -10), "1.58E+12", 4, -10)
checkDecimal(Decimal(103050709L, 9, -10), "1.03050709E+18", 9, -10)
checkDecimal(Decimal(1e8.toLong, 10, -10), "1.00000000E+18", 10, -10)
}

test("double and long values") {
/** Check that a Decimal converts to the given double and long values */
def checkValues(d: Decimal, doubleValue: Double, longValue: Long): Unit = {
sql/core/src/main/scala/org/apache/spark/sql/functions.scala (32 additions, 0 deletions)
@@ -1389,6 +1389,38 @@ object functions {
*/
def rint(columnName: String): Column = rint(Column(columnName))

/**
* Returns the value of the column `e` rounded to 0 decimal places.
*
* @group math_funcs
* @since 1.5.0
*/
def round(e: Column): Column = round(e.expr, 0)

/**
* Returns the value of the given column rounded to 0 decimal places.
*
* @group math_funcs
* @since 1.5.0
*/
def round(columnName: String): Column = round(Column(columnName), 0)

/**
* Returns the value of `e` rounded to `scale` decimal places.
*
* @group math_funcs
* @since 1.5.0
*/
def round(e: Column, scale: Int): Column = Round(e.expr, Literal(scale))

/**
* Returns the value of the given column rounded to `scale` decimal places.
*
* @group math_funcs
* @since 1.5.0
*/
def round(columnName: String, scale: Int): Column = round(Column(columnName), scale)

/**
* Shift the given value numBits left. If the given value is a long value, this function
* will return a long value else it will return an integer value.