SPARK-33121

Spark Streaming 3.1.1 hangs on shutdown


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.1.1
    • Fix Version/s: None
    • Component/s: DStreams

    Description

      Hi. I am trying to migrate from Spark 2.4.5 to 3.1.1, and there is a problem with graceful shutdown.

      The config parameter "spark.streaming.stopGracefullyOnShutdown" is set to "true".

      Here is the code:

      inputStream.foreachRDD { rdd =>
        rdd.foreachPartition { _ =>
          // Simulate a long-running task.
          Thread.sleep(5000)
        }
      }
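
      For completeness, a self-contained sketch of the setup described above (the socket source, batch interval, application name, and master are assumptions added for illustration; only the config key and the foreachRDD body come from this report):

      import org.apache.spark.SparkConf
      import org.apache.spark.streaming.{Seconds, StreamingContext}

      object GracefulShutdownRepro {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("graceful-shutdown-repro")  // placeholder name
            .setMaster("local[2]")                  // placeholder master
            .set("spark.streaming.stopGracefullyOnShutdown", "true")

          val ssc = new StreamingContext(conf, Seconds(10)) // assumed batch interval

          // Assumed socket source; any DStream reproduces the pattern.
          val inputStream = ssc.socketTextStream("localhost", 9999)

          inputStream.foreachRDD { rdd =>
            rdd.foreachPartition { _ =>
              Thread.sleep(5000) // long-running task, as in the report
            }
          }

          ssc.start()
          ssc.awaitTermination()
        }
      }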
      

      I send a SIGTERM signal to stop the streaming application, and after the sleep completes the following exception is raised:

      streaming-agg-tds-data_1  | java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner@7ca7f0b8 rejected from java.util.concurrent.ThreadPoolExecutor@2474219c[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 1]
      streaming-agg-tds-data_1  |     at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
      streaming-agg-tds-data_1  |     at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
      streaming-agg-tds-data_1  |     at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
      streaming-agg-tds-data_1  |     at org.apache.spark.executor.Executor.launchTask(Executor.scala:270)
      streaming-agg-tds-data_1  |     at org.apache.spark.scheduler.local.LocalEndpoint.$anonfun$reviveOffers$1(LocalSchedulerBackend.scala:93)
      streaming-agg-tds-data_1  |     at org.apache.spark.scheduler.local.LocalEndpoint.$anonfun$reviveOffers$1$adapted(LocalSchedulerBackend.scala:91)
      streaming-agg-tds-data_1  |     at scala.collection.Iterator.foreach(Iterator.scala:941)
      streaming-agg-tds-data_1  |     at scala.collection.Iterator.foreach$(Iterator.scala:941)
      streaming-agg-tds-data_1  |     at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
      streaming-agg-tds-data_1  |     at scala.collection.IterableLike.foreach(IterableLike.scala:74)
      streaming-agg-tds-data_1  |     at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
      streaming-agg-tds-data_1  |     at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
      streaming-agg-tds-data_1  |     at org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalSchedulerBackend.scala:91)
      streaming-agg-tds-data_1  |     at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalSchedulerBackend.scala:68)
      streaming-agg-tds-data_1  |     at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:115)
      streaming-agg-tds-data_1  |     at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
      streaming-agg-tds-data_1  |     at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
      streaming-agg-tds-data_1  |     at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
      streaming-agg-tds-data_1  |     at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
      streaming-agg-tds-data_1  |     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      streaming-agg-tds-data_1  |     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      streaming-agg-tds-data_1  |     at java.lang.Thread.run(Thread.java:748)
      streaming-agg-tds-data_1  | 2021-04-22 13:33:41 WARN  JobGenerator - Timed out while stopping the job generator (timeout = 10000)
      streaming-agg-tds-data_1  | 2021-04-22 13:33:41 INFO  JobGenerator - Waited for jobs to be processed and checkpoints to be written
      streaming-agg-tds-data_1  | 2021-04-22 13:33:41 INFO  JobGenerator - Stopped JobGenerator

      After this exception and the "JobGenerator - Stopped JobGenerator" log line, the streaming application freezes and is only terminated when the shutdown-hook timeout expires (config parameter "hadoop.service.shutdown.timeout").

      By contrast, there is no problem with graceful shutdown in Spark 2.4.5.
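
      For reference, the graceful path that "spark.streaming.stopGracefullyOnShutdown" enables via the JVM shutdown hook is roughly equivalent to calling an explicit graceful stop on the StreamingContext (sketch only; "ssc" refers to the context from the setup sketch above):

      // Wait for queued/running batches to finish instead of
      // interrupting them, then stop the SparkContext as well.
      ssc.stop(stopSparkContext = true, stopGracefully = true)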

Attachments

Activity

People

    Assignee: Unassigned
    Reporter: Dmitry Tverdokhleb (tverdokhlebd)
    Votes: 0
    Watchers: 4

Dates

    Created:
    Updated: