Beam / BEAM-6812

Convert keys to ByteArray in Combine.perKey for Spark


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 2.10.0
    • Fix Version/s: 2.12.0
    • Component/s: runner-spark
    • Labels: None

      Description

      • During calls to Combine.perKey, we want the keys to have a consistent hashCode when invoked from different JVMs.
      • However, while testing this at our company, we found that when using protobuf messages as keys during a combine, the hashCodes can differ for the same key across JVMs. This results in duplicates.
      • The `ByteArray` class used by the Spark runner has a stable hashCode when dealing with byte arrays.
      • GroupByKey already converts keys to `ByteArray` correctly and uses coders for serialization.
      • The fix does something similar for combines.
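The steps above can be illustrated with a minimal sketch. This is not the actual Beam patch; `StableKey` below is a hypothetical stand-in for the runner's `ByteArray` wrapper, and it shows why hashing the coder-encoded key bytes (rather than relying on the key object's own hashCode) is stable across JVMs:

```java
import java.util.Arrays;

// Hypothetical stand-in for the Spark runner's ByteArray wrapper.
// hashCode/equals depend only on the encoded byte content, which is
// identical on every JVM, unlike identity-based Object.hashCode.
public final class StableKey {
    private final byte[] bytes;

    public StableKey(byte[] bytes) {
        this.bytes = bytes;
    }

    @Override
    public int hashCode() {
        // Arrays.hashCode is a pure function of the array contents,
        // so two JVMs encoding the same key compute the same hash.
        return Arrays.hashCode(bytes);
    }

    @Override
    public boolean equals(Object other) {
        return other instanceof StableKey
                && Arrays.equals(bytes, ((StableKey) other).bytes);
    }
}
```

In the real fix, the bytes would come from encoding the key with its Beam `Coder` before the combine, mirroring what GroupByKey already does, so Spark partitions identical keys to the same reducer instead of producing duplicates.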

        Attachments

          Issue Links

            Activity

              People

              • Assignee: Ankit Jhalaria (jhalarua)
              • Reporter: Ankit Jhalaria (jhalarua)
              • Votes: 0
              • Watchers: 2

                Dates

                • Created:
                  Updated:
                  Resolved:

                  Time Tracking

                  • Estimated: Not Specified
                  • Remaining: 0h
                  • Logged: 2h 50m