SPARK-41554

Decimal.changePrecision produces ArrayIndexOutOfBoundsException


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.3.1
    • Fix Version/s: 3.2.4, 3.3.2, 3.4.0
    • Component/s: SQL
    • Labels: None

    Description

Reducing a Decimal's scale by more than 18 produces an exception.

      Decimal(1, 38, 19).changePrecision(38, 0)
      java.lang.ArrayIndexOutOfBoundsException: 19
          at org.apache.spark.sql.types.Decimal.changePrecision(Decimal.scala:377)
          at org.apache.spark.sql.types.Decimal.changePrecision(Decimal.scala:328)

Reproducing with a SQL query:

      sql("select cast(cast(cast(cast(id as decimal(38,15)) as decimal(38,30)) as decimal(38,37)) as decimal(38,17)) from range(3)").show

The bug only affects Decimals stored in the compact long representation; Decimals backed by scala.math.BigDecimal internally work fine.
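      The failure mode can be illustrated without Spark. The compact-long path keeps a lookup table of powers of ten, and 10^18 is the largest power of ten that fits in a Long, so the table has indices 0..18; rescaling down by 19 indexes past the end. A minimal self-contained sketch of this (the POW_10 and MAX_LONG_DIGITS names mirror Spark's Decimal.scala, but the rescaleDown helper is a hypothetical simplification of the failing division, not Spark's actual code):

      ```scala
      object CompactDecimalSketch {
        // 10^18 is the largest power of ten representable in a signed Long,
        // so the table covers exponents 0 through 18 (19 entries).
        val MAX_LONG_DIGITS = 18
        val POW_10: Array[Long] =
          Array.tabulate(MAX_LONG_DIGITS + 1)(i => math.pow(10, i).toLong)

        // Hypothetical simplification of the scale-reduction step:
        // divide the compact value by 10^(old scale - new scale).
        // If the scale shrinks by more than 18, POW_10(scaleDiff) is
        // an out-of-bounds access, matching the reported exception.
        def rescaleDown(compact: Long, scaleDiff: Int): Long =
          compact / POW_10(scaleDiff)

        def main(args: Array[String]): Unit = {
          println(rescaleDown(1000000000000000000L, 18)) // index 18 exists: ok
          try {
            rescaleDown(1L, 19) // scale reduced by 19, as in the report
          } catch {
            case e: ArrayIndexOutOfBoundsException =>
              println(s"caught: $e")
          }
        }
      }
      ```

      The BigDecimal-backed path avoids this because java.math.BigDecimal.setScale rescales with arbitrary precision and never consults a fixed-size power table.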


People

    Assignee: fe2s Oleksiy Dyagilev
    Reporter: fe2s Oleksiy Dyagilev
    Votes: 0
    Watchers: 3
