Beam / BEAM-8367

Python BigQuery sink should use unique IDs for mode STREAMING_INSERTS


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.17.0
    • Component/s: sdk-py-core
    • Labels: None

      Description

      Unique IDs ensure (on a best-effort basis) that writes to BigQuery are idempotent; for example, the same record is not written twice after a VM failure.

      Currently, the Python BQ sink inserts BigQuery insert IDs here, but they will be regenerated after a VM failure, resulting in data duplication.

      https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/bigquery.py#L766
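      To illustrate the failure mode, here is a minimal pure-Python sketch (the `FakeBigQueryTable` class is hypothetical, not a Beam or BigQuery API) of BigQuery's best-effort dedup on `insertId`: a retry that re-sends the same ID is deduplicated, while a retry that regenerates the ID produces a duplicate row.

```python
import uuid


class FakeBigQueryTable:
    """Hypothetical stand-in for BigQuery's best-effort streaming dedup:
    rows whose insertId was already seen are silently dropped."""

    def __init__(self):
        self.rows = {}  # insert_id -> row

    def streaming_insert(self, insert_id, row):
        self.rows.setdefault(insert_id, row)


record = {"name": "example"}

# Stable ID (what checkpointing via Reshuffle gives us): the retry after a
# VM failure re-sends the SAME insertId, so the row is written only once.
stable = FakeBigQueryTable()
insert_id = str(uuid.uuid4())
stable.streaming_insert(insert_id, record)
stable.streaming_insert(insert_id, record)  # retried write, same id
assert len(stable.rows) == 1

# Regenerated ID (the current bug): the retry mints a NEW insertId, so
# BigQuery treats the row as distinct and the record is duplicated.
buggy = FakeBigQueryTable()
buggy.streaming_insert(str(uuid.uuid4()), record)
buggy.streaming_insert(str(uuid.uuid4()), record)  # retried write, new id
assert len(buggy.rows) == 2
```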


      The correct fix is to perform a Reshuffle to checkpoint the unique IDs once they are generated, similar to how the Java BQ sink operates.

      https://github.com/apache/beam/blob/dcf6ad301069e4d2cfaec5db6b178acb7bb67f49/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/StreamingWriteTables.java#L225


      Pablo, can you do an initial assessment here?

      I think this is a relatively small fix but I might be wrong.

    Attachments

    Issue Links

    Activity

    People

    • Assignee: pabloem Pablo Estrada
    • Reporter: chamikara Chamikara Madhusanka Jayalath
    • Votes: 0
    • Watchers: 2

    Dates

    • Created:
    • Updated:
    • Resolved:

    Time Tracking

    • Original Estimate: Not Specified
    • Remaining Estimate: 0h
    • Time Spent: 2h