SPARK-38534: Disable to_timestamp('366', 'DD') test case
(Sub-task of SPARK-33772: Build and Run Spark on Java 17)


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.3.0
    • Fix Version/s: 3.3.0
    • Component/s: SQL, Tests
    • Labels: None

    Description

      Currently, the daily Java 11 and Java 17 builds are broken.

      *Java 8*

      $ bin/spark-shell --conf spark.sql.ansi.enabled=true
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      22/03/12 00:59:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      Spark context Web UI available at http://172.16.0.31:4040
      Spark context available as 'sc' (master = local[*], app id = local-1647075572229).
      Spark session available as 'spark'.
      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/  '_/
         /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
            /_/
      
      Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 1.8.0_322)
      Type in expressions to have them evaluated.
      Type :help for more information.
      
      scala> sql("select to_timestamp('366', 'DD')").show
      java.time.format.DateTimeParseException: Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
      

      *Java 11+*

      $ bin/spark-shell --conf spark.sql.ansi.enabled=true
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      22/03/12 01:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      Spark context Web UI available at http://172.16.0.31:4040
      Spark context available as 'sc' (master = local[*], app id = local-1647075607932).
      Spark session available as 'spark'.
      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/  '_/
         /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
            /_/
      
      Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 11.0.12)
      Type in expressions to have them evaluated.
      Type :help for more information.
      
      scala> sql("select to_timestamp('366', 'DD')").show
      java.time.DateTimeException: Invalid date 'DayOfYear 366' as '1970' is not a leap year. If necessary set spark.sql.ansi.enabled to false to bypass this error.
      

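      The two stack traces point at a JDK behavior change rather than a Spark-side regression: Java 8 stops parsing the 'DD' pattern after exactly two digits (hence "unparsed text found at index 2"), while Java 9 and later accept a third digit, so day-of-year 366 is parsed successfully and then rejected during resolution because the default year 1970 is not a leap year. A minimal java.time sketch outside Spark (the defaulting of the year to 1970 here is an assumption mirroring what Spark's formatter effectively does; this is not Spark's actual code):

      ```java
      import java.time.LocalDate;
      import java.time.format.DateTimeFormatter;
      import java.time.format.DateTimeFormatterBuilder;
      import java.time.temporal.ChronoField;

      public class DayOfYearDemo {
          public static void main(String[] args) {
              // "DD" = day-of-year, minimum two digits; default the year to 1970
              // during resolution (illustrative sketch, not Spark's formatter).
              DateTimeFormatter f = new DateTimeFormatterBuilder()
                      .appendPattern("DD")
                      .parseDefaulting(ChronoField.YEAR, 1970)
                      .toFormatter();
              try {
                  System.out.println(f.parse("366", LocalDate::from));
              } catch (Exception e) {
                  // Java 8:  only "36" is consumed -> "unparsed text found at index 2"
                  // Java 9+: all three digits parse -> "Invalid date 'DayOfYear 366'
                  //          as '1970' is not a leap year" during resolution
                  System.out.println("parse failed: " + e.getMessage());
              }
          }
      }
      ```

      Either way the parse fails, but with different exception messages, which is why a test asserting one specific message breaks across JDK versions.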
      People

          Assignee: Dongjoon Hyun (dongjoon)
          Reporter: Dongjoon Hyun (dongjoon)
          Votes: 0
          Watchers: 2