SPARK-38324

The second range is not [0, 59] in the day time ANSI interval


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.3.0
    • Fix Version/s: 3.5.0
    • Component/s: Java API
    • Labels: None
    • Environment: Spark 3.3.0 snapshot

    Description

      https://spark.apache.org/docs/latest/sql-ref-datatypes.html

      • SECOND, seconds within minutes and possibly fractions of a second [0..59.999999]

      The doc states that SECOND is seconds within minutes, so its integral range should be [0, 59].


      But testing shows that a 99-second field is accepted; the parser silently carries the overflow into the minute field (99 s becomes 1 min 39 s):

      >>> spark.sql("select INTERVAL '10 01:01:99' DAY TO SECOND")
      DataFrame[INTERVAL '10 01:02:39' DAY TO SECOND: interval day to second]
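
      For illustration, here is a minimal Python sketch of the range check the doc implies. check_day_time_interval is a hypothetical helper, not Spark's actual parser, and the regex only covers the 'D HH:MM:SS[.fraction]' form used in this report:

      import re

      def check_day_time_interval(literal):
          # Validate the fields of a 'D HH:MM:SS[.fraction]' day-time interval
          # literal, applying to SECOND the same range check that Spark
          # already applies to MINUTE.
          m = re.fullmatch(r"(\d+) (\d{1,2}):(\d{1,2}):(\d{1,2}(?:\.\d+)?)", literal)
          if m is None:
              raise ValueError(f"cannot parse interval literal: {literal!r}")
          hour, minute, second = int(m.group(2)), int(m.group(3)), float(m.group(4))
          if not 0 <= hour <= 23:
              raise ValueError(f"hour {hour} outside range [0, 23]")
          if not 0 <= minute <= 59:
              raise ValueError(f"minute {minute} outside range [0, 59]")
          if not second < 60:
              raise ValueError(f"second {int(second)} outside range [0, 59]")

      for lit in ("10 01:60:01", "10 01:01:99"):
          try:
              check_day_time_interval(lit)
              print(f"{lit!r}: accepted")
          except ValueError as e:
              print(f"{lit!r}: rejected ({e})")
      # '10 01:60:01': rejected (minute 60 outside range [0, 59])
      # '10 01:01:99': rejected (second 99 outside range [0, 59])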


      Meanwhile, the minute range check works as expected:

      >>> spark.sql("select INTERVAL '10 01:60:01' DAY TO SECOND")
      requirement failed: minute 60 outside range [0, 59](line 1, pos 16)

      == SQL ==
      select INTERVAL '10 01:60:01' DAY TO SECOND
      ----------------^^^
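
      Until the fix lands, a caller can guard literals with the sketch above before handing them to spark.sql (spark below is an existing SparkSession; the guard is the hypothetical helper from the sketch, not a Spark API):

      lit = "10 01:01:99"
      check_day_time_interval(lit)  # raises here, before Spark silently normalizes it
      spark.sql(f"select INTERVAL '{lit}' DAY TO SECOND")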


          People

            Assignee: chongg@nvidia chong
            Reporter: chongg@nvidia chong
