SPARK-15849

FileNotFoundException on _temporary while doing saveAsTable to S3


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 1.6.1
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None
    • Environment: AWS EC2 with Spark on YARN and S3 storage

    Description

      When submitting Spark jobs to a YARN cluster, I occasionally see these errors while doing saveAsTable. I have tried running with spark.speculation=false and get the same error. These errors are similar to SPARK-2984, but my jobs are writing to S3 (s3n):

      Caused by: java.io.FileNotFoundException: File s3n://xxxxxxx/_temporary/0/task_201606080516_0004_m_000079 does not exist.
      at org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:506)
      at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:360)
      at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitJob(FileOutputCommitter.java:310)
      at org.apache.parquet.hadoop.ParquetOutputCommitter.commitJob(ParquetOutputCommitter.java:46)
      at org.apache.spark.sql.execution.datasources.BaseWriterContainer.commitJob(WriterContainer.scala:230)
      at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelation.scala:151)
      ... 42 more
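
      For context, a minimal sketch of the kind of write that exercises this code path. The bucket, paths, and table name below are hypothetical; this is illustrative of the reported pattern (a Parquet saveAsTable to an s3n:// location from Spark 1.6 on YARN), not a confirmed reproduction.

      // Minimal sketch (Spark 1.6.x API); bucket, paths, and table name are hypothetical.
      import org.apache.spark.SparkContext
      import org.apache.spark.sql.hive.HiveContext

      val sc = new SparkContext()                  // master/appName supplied by spark-submit
      val sqlContext = new HiveContext(sc)         // saveAsTable registers the table in the metastore

      val df = sqlContext.read.parquet("s3n://example-bucket/input/")   // hypothetical input

      df.write
        .mode("overwrite")
        .format("parquet")
        .option("path", "s3n://example-bucket/warehouse/my_table")      // external table location on S3
        .saveAsTable("my_table")

      // FileOutputCommitter first writes task output under <path>/_temporary/... and then
      // merges/renames it during commitJob (the mergePaths call in the stack trace). On s3n a
      // rename is a copy-and-delete and directory listings are not guaranteed to be immediately
      // consistent, so the committer can fail to find a task's _temporary directory.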


            People

              Assignee: Unassigned
              Reporter: Sandeep (sandeepb)
              Votes: 0
              Watchers: 4
