Spark / SPARK-26806

EventTimeStats.merge doesn't handle "zero.merge(zero)" correctly


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Versions: 2.2.1, 2.2.2, 2.2.3, 2.3.0, 2.3.1, 2.3.2, 2.3.3, 2.4.0
    • Fix Versions: 2.2.4, 2.3.3, 2.4.1, 3.0.0
    • Component: Structured Streaming
    • Labels: None

    Description

      Right now, EventTimeStats.merge doesn't handle the "zero.merge(zero)" case, which makes "avg" become "NaN". Anything subsequently merged with the result of "zero.merge(zero)" keeps "avg" as "NaN". Finally, "NaN".toLong returns "0", so the user sees an incorrect report like the following:

      "eventTime" : {
          "avg" : "1970-01-01T00:00:00.000Z",
          "max" : "2019-01-31T12:57:00.000Z",
          "min" : "2019-01-30T18:44:04.000Z",
          "watermark" : "1970-01-01T00:00:00.000Z"
        }
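
      For context, here is a minimal Scala sketch of the problem and of a guarded merge that avoids it. It is simplified from Spark's EventTimeStats (event times as epoch millis) and is not the exact Spark source; mergeNaive and ZeroMergeDemo are names used only for this illustration. Merging two empty ("zero") stats divides by a zero count, so "avg" becomes NaN, NaN then survives every later merge, and "NaN".toLong finally turns it into 0 (rendered as 1970-01-01T00:00:00.000Z).

      // Simplified sketch of EventTimeStats; field names follow the Spark class,
      // but this is not the exact Spark source. Event times are epoch millis.
      case class EventTimeStats(var max: Long, var min: Long, var avg: Double, var count: Long) {

        // Naive merge: when both sides have count == 0, the division below is
        // 0.0 / 0, which is Double.NaN, and NaN survives every later merge.
        def mergeNaive(that: EventTimeStats): Unit = {
          max = math.max(max, that.max)
          min = math.min(min, that.min)
          count += that.count
          avg += (that.avg - avg) * that.count / count   // NaN when count stays 0
        }

        // Guarded merge: skip an empty input and copy a non-empty input into an
        // empty accumulator, so the division only runs when count > 0.
        def merge(that: EventTimeStats): Unit = {
          if (that.count == 0) {
            // nothing to merge
          } else if (count == 0) {
            max = that.max; min = that.min; avg = that.avg; count = that.count
          } else {
            max = math.max(max, that.max)
            min = math.min(min, that.min)
            count += that.count
            avg += (that.avg - avg) * that.count / count
          }
        }
      }

      object EventTimeStats {
        def zero: EventTimeStats =
          EventTimeStats(max = Long.MinValue, min = Long.MaxValue, avg = 0.0, count = 0L)
      }

      object ZeroMergeDemo extends App {
        val bad = EventTimeStats.zero
        bad.mergeNaive(EventTimeStats.zero)
        bad.mergeNaive(EventTimeStats(1000L, 1000L, 1000.0, 1L))
        println(bad.avg)         // NaN: poisoned by zero.merge(zero)
        println(bad.avg.toLong)  // 0, rendered as 1970-01-01T00:00:00.000Z

        val good = EventTimeStats.zero
        good.merge(EventTimeStats.zero)
        good.merge(EventTimeStats(1000L, 1000L, 1000.0, 1L))
        println(good.avg)        // 1000.0
      }

      The essential point is that the running average must only be updated when the merged count is non-zero; once a NaN is produced, no later merge can recover from it.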
      

            People

              Assignee: Shixiong Zhu (zsxwing)
              Reporter: Cheng Lian (lian cheng)
