  1. Spark
  2. SPARK-30374 Feature Parity between PostgreSQL and Spark (ANSI/SQL)
  3. SPARK-23179

Support option to throw exception if overflow occurs during Decimal arithmetic


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 3.0.0
    • Component/s: SQL
    • Labels: None

    Description

      The SQL ANSI 2011 standard states that an exception should be thrown when an arithmetic operation overflows. This is what most SQL databases do (e.g. SQL Server, DB2). Hive currently returns NULL (as Spark does), but HIVE-18291 is open to make it SQL compliant.

      I propose adding a config option that lets the user decide whether Spark should behave according to the SQL standard (throw an exception) or keep the current behavior (return NULL).
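
      A minimal sketch of what this could look like from the user's side, assuming the option is exposed as a SQL conf named spark.sql.decimalOperations.nullOnOverflow (the name here is illustrative, not final):

      import org.apache.spark.sql.SparkSession

      object DecimalOverflowDemo {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .appName("decimal-overflow-demo")
            .master("local[*]")
            .getOrCreate()

          // A product that needs 39 digits and therefore cannot fit into the
          // DECIMAL(38,0) result type: 38 nines multiplied by 10.
          val overflowing =
            "SELECT CAST(99999999999999999999999999999999999999 AS DECIMAL(38,0)) * 10 AS v"

          // Current Spark behaviour: the overflow silently becomes NULL.
          spark.conf.set("spark.sql.decimalOperations.nullOnOverflow", "true")
          spark.sql(overflowing).show()   // prints a single row with v = null

          // Proposed SQL-standard behaviour: the same query fails instead of returning NULL.
          spark.conf.set("spark.sql.decimalOperations.nullOnOverflow", "false")
          try {
            spark.sql(overflowing).show()
          } catch {
            // The overflow surfaces as an ArithmeticException, possibly wrapped
            // in a SparkException when it happens inside a task.
            case e: Exception => println(s"overflow rejected: ${e.getMessage}")
          }

          spark.stop()
        }
      }

      With the flag set to false, Spark would match SQL Server and DB2; keeping it true preserves the current NULL-on-overflow behaviour for backward compatibility.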

      Attachments

        Issue Links

        Activity


          People

            Assignee: Marco Gaido (mgaido)
            Reporter: Marco Gaido (mgaido)
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved:
