Spark / SPARK-40624

A DECIMAL value with division by 0 errors in DataFrame but evaluates to NULL in SparkSQL


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 3.2.1
    • Fix Version/s: None
    • Component/s: Spark Shell
    • Labels: None

    Description

      Describe the bug

Constructing a row with BigDecimal("1.0/0") in spark-shell throws an exception during RDD creation. However, the SQL expression 1.0/0 evaluates to NULL when inserted into a DECIMAL(20,10) column of a table via spark-sql.

      To Reproduce

      On Spark 3.2.1 (commit 4f25b3f712), using spark-sql:

      $SPARK_HOME/bin/spark-sql

Execute the following; the inserted value evaluates to NULL:

      spark-sql> create table decimal_vals(c1 DECIMAL(20,10)) stored as ORC;
      spark-sql> insert into decimal_vals select 1.0/0;
      spark-sql> select * from decimal_vals;
      NULL
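
      Note: returning NULL for division by zero is Spark's default (non-ANSI) SQL semantics. As a hedged sketch, enabling ANSI mode via the documented spark.sql.ansi.enabled configuration should make the same query raise an error instead (exact error message varies by version):

      ```sql
      -- Default mode: division by zero yields NULL
      SET spark.sql.ansi.enabled=false;
      SELECT 1.0/0;   -- NULL

      -- ANSI mode: the same expression raises a divide-by-zero error
      -- instead of silently returning NULL
      SET spark.sql.ansi.enabled=true;
      SELECT 1.0/0;   -- error
      ```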

      Using spark-shell:

      $SPARK_HOME/bin/spark-shell

Execute the following; RDD creation errors out:

      scala> import org.apache.spark.sql.{Row, SparkSession}
      import org.apache.spark.sql.{Row, SparkSession}
      scala> import org.apache.spark.sql.types._
      import org.apache.spark.sql.types._
      scala> val rdd = sc.parallelize(Seq(Row(BigDecimal("1.0/0"))))
      java.lang.NumberFormatException
        at java.math.BigDecimal.<init>(BigDecimal.java:497)
        at java.math.BigDecimal.<init>(BigDecimal.java:383)
        at java.math.BigDecimal.<init>(BigDecimal.java:809)
        at scala.math.BigDecimal$.exact(BigDecimal.scala:126)
        at scala.math.BigDecimal$.apply(BigDecimal.scala:284)
        ... 49 elided
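
      The NumberFormatException above comes from string parsing, not from Spark: the BigDecimal constructor parses a numeric literal, it does not evaluate arithmetic, so the string "1.0/0" fails before any RDD logic runs. A minimal sketch in plain Java (no Spark required), using only java.math.BigDecimal from the stack trace:

      ```java
      import java.math.BigDecimal;

      public class BigDecimalParseDemo {
          public static void main(String[] args) {
              // "1.0/0" is not a valid decimal literal; the constructor
              // parses strings, it does not evaluate division, so it throws.
              try {
                  new BigDecimal("1.0/0");
                  System.out.println("parsed");
              } catch (NumberFormatException e) {
                  System.out.println("NumberFormatException");
              }

              // A valid numeric literal parses fine:
              System.out.println(new BigDecimal("1.0")); // prints 1.0
          }
      }
      ```

      This is why the two repros are not the same input: spark-sql sees a division expression, while the spark-shell snippet hands an unparseable string to a constructor.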

      Expected behavior

We expect the two Spark interfaces (spark-sql and spark-shell) to behave consistently for the same data type and input combination (BigDecimal/DECIMAL(20,10) and 1.0/0).

       

People

    Assignee: Unassigned
    Reporter: x/sys (xsys)
    Votes: 0
    Watchers: 1

            Dates

              Created:
              Updated:
              Resolved: