Spark / SPARK-21239

Support WAL recovery on Windows


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.6.3, 2.1.0, 2.1.1, 2.2.0
    • Fix Version/s: None
    • Component/s: DStreams, Windows
    • Labels: None

    Description

      When the driver fails over, it reads the WAL from HDFS by calling WriteAheadLogBackedBlockRDD.getBlockFromWriteAheadLog(). However, that method requires a dummy local block-store path to satisfy its parameter signature, and on Windows this path contains a drive letter followed by a colon, which Hadoop rejects as an invalid path. I removed the potential drive letter and colon.

      I found one email on the spark-user mailing list that discussed this bug: https://www.mail-archive.com/user@spark.apache.org/msg55030.html
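
      The workaround described above can be sketched as follows. This is a hypothetical illustration, not the actual patch: the helper name `stripDriveLetter` is invented here. Hadoop path parsing treats a colon as a scheme separator, so dropping the Windows drive letter and colon leaves a string that Hadoop will accept as a plain local path.

```java
public class WindowsPathFix {
    // Hypothetical helper (not Spark's actual code): drop a leading
    // Windows drive letter and colon ("C:") so the dummy local path
    // no longer contains a colon that Hadoop would misparse.
    public static String stripDriveLetter(String path) {
        // "C:\tmp\spark\wal" -> "\tmp\spark\wal";
        // paths without a drive letter pass through unchanged.
        return path.replaceFirst("^[A-Za-z]:", "");
    }
}
```

      A transformation along these lines would run before the dummy path is handed to the WAL read path, leaving non-Windows paths untouched.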


    People

      Assignee: Unassigned
      Reporter: Yun Tang
      Votes: 0
      Watchers: 4
