Spark / SPARK-45981 (Improve Python language test coverage) / SPARK-46127

Flaky `pyspark.tests.test_worker.WorkerSegfaultNonDaemonTest.test_python_segfault` with Python 3.12


Details

    Description

      Traceback (most recent call last):
        File "/__w/spark/spark/python/pyspark/tests/test_worker.py", line 241, in test_python_segfault
          self.sc.parallelize([1]).map(lambda x: f()).count()
        File "/__w/spark/spark/python/pyspark/rdd.py", line 2315, in count
          return self.mapPartitions(lambda i: [sum(1 for _ in i)]).sum()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/__w/spark/spark/python/pyspark/rdd.py", line 2290, in sum
          return self.mapPartitions(lambda x: [sum(x)]).fold(  # type: ignore[return-value]
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/__w/spark/spark/python/pyspark/rdd.py", line 2043, in fold
          vals = self.mapPartitions(func).collect()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/__w/spark/spark/python/pyspark/rdd.py", line 1832, in collect
          sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1322, in __call__
          return_value = get_return_value(
                         ^^^^^^^^^^^^^^^^^
        File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 326, in get_return_value
          raise Py4JJavaError(
      py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
      : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (localhost executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
      	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:560)
      	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:535)
      	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:35)
      	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:863)
      	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:843)
      	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:473)
      	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
      	at scala.collection.mutable.Growable.addAll(Growable.scala:61)
      	at scala.collection.mutable.Growable.addAll$(Growable.scala:57)
      	at scala.collection.mutable.ArrayBuilder.addAll(ArrayBuilder.scala:67)
      	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
      	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
      	at py4j.Gateway.invoke(Gateway.java:282)
      	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
      	at py4j.commands.CallCommand.execute(CallCommand.java:79)
      	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
      	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
      	at java.base/java.lang.Thread.run(Thread.java:840)
      Caused by: org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
      	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:560)
      	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:535)
      	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:35)
      	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:863)
      	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:843)
      	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:473)
      	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
      	at scala.collection.mutable.Growable.addAll(Growable.scala:61)
      	at scala.collection.mutable.Growable.addAll$(Growable.scala:57)
      	at scala.collection.mutable.ArrayBuilder.addAll(ArrayBuilder.scala:67)
      	at scala.collection.IterableOnceOps.toArray(IterableOnce.scala:1346)
      	at scala.collection.IterableOnceOps.toArray$(IterableOnce.scala:1339)
      	at org.apache.spark.InterruptibleIterator.toArray(InterruptibleIterator.scala:28)
      	at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1047)
      	at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2468)
      	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
      	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
      	at org.apache.spark.scheduler.Task.run(Task.scala:141)
      	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:628)
      	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
      	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
      	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:96)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:631)
      	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
      	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
      	... 1 more
      Caused by: java.io.EOFException
      	at java.base/java.io.DataInputStream.readInt(DataInputStream.java:386)
      	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:851)
      	... 22 more
      
      
      During handling of the above exception, another exception occurred:
      
      Traceback (most recent call last):
        File "/__w/spark/spark/python/pyspark/tests/test_worker.py", line 243, in test_python_segfault
          self.assertRegex(str(e), "Segmentation fault")
      AssertionError: Regex didn't match: 'Segmentation fault' not found in 'An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.\n: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (localhost executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)\n\tat org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:560)\n\tat org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:535)\n\tat scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:35)\n\tat org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:863)\n\tat org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:843)\n\tat org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:473)\n\tat org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)\n\tat scala.collection.mutable.Growable.addAll(Growable.scala:61)\n\tat scala.collection.mutable.Growable.addAll$(Growable.scala:57)\n\tat scala.collection.mutable.ArrayBuilder.addAll(ArrayBuilder.scala:67)\n\tat scala.collection.IterableOnceOps.toArray(IterableOnce.scala:1346)\n\tat scala.collection.IterableOnceOps.toArray$(IterableOnce.scala:1339)\n\tat org.apache.spark.InterruptibleIterator.toArray(InterruptibleIterator.scala:28)\n\tat org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1047)\n\tat org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2468)\n\tat org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)\n\tat org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:141)\n\tat org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:628)\n\tat org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)\n\tat org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:96)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:631)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:840)\nCaused by: java.io.EOFException\n\tat java.base/java.io.DataInputStream.readInt(DataInputStream.java:386)\n\tat org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:851)\n\t... 
22 more\n\nDriver stacktrace:\n\tat org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2820)\n\tat org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2817)\n\tat scala.collection.immutable.List.foreach(List.scala:333)\n\tat org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2817)\n\tat org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1258)\n\tat org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1258)\n\tat scala.Option.foreach(Option.scala:437)\n\tat org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1258)\n\tat org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3087)\n\tat org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3021)\n\tat org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3010)\n\tat org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)\n\tat org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:990)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2428)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2449)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2468)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2493)\n\tat org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1047)\n\tat org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)\n\tat org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)\n\tat org.apache.spark.rdd.RDD.withScope(RDD.scala:408)\n\tat org.apache.spark.rdd.RDD.collect(RDD.scala:1046)\n\tat org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:196)\n\tat org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)\n\tat java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.base/java.lang.reflect.Method.invoke(Method.java:568)\n\tat py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)\n\tat py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)\n\tat py4j.Gateway.invoke(Gateway.java:282)\n\tat py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)\n\tat py4j.commands.CallCommand.execute(CallCommand.java:79)\n\tat py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)\n\tat py4j.ClientServerConnection.run(ClientServerConnection.java:106)\n\tat java.base/java.lang.Thread.run(Thread.java:840)\nCaused by: org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)\n\tat org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:560)\n\tat org.apache.spark.api.python.BasePythonRunner$ReaderIterator$$anonfun$1.applyOrElse(PythonRunner.scala:535)\n\tat scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:35)\n\tat org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:863)\n\tat org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:843)\n\tat org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:473)\n\tat 
org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)\n\tat scala.collection.mutable.Growable.addAll(Growable.scala:61)\n\tat scala.collection.mutable.Growable.addAll$(Growable.scala:57)\n\tat scala.collection.mutable.ArrayBuilder.addAll(ArrayBuilder.scala:67)\n\tat scala.collection.IterableOnceOps.toArray(IterableOnce.scala:1346)\n\tat scala.collection.IterableOnceOps.toArray$(IterableOnce.scala:1339)\n\tat org.apache.spark.InterruptibleIterator.toArray(InterruptibleIterator.scala:28)\n\tat org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1047)\n\tat org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2468)\n\tat org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)\n\tat org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:141)\n\tat org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:628)\n\tat org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)\n\tat org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:96)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:631)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\t... 1 more\nCaused by: java.io.EOFException\n\tat java.base/java.io.DataInputStream.readInt(DataInputStream.java:386)\n\tat org
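      The final java.io.EOFException at DataInputStream.readInt is the JVM-side symptom of the crash: PythonRunner reads length-prefixed frames from the Python worker's socket, and a worker that dies with SIGSEGV closes that socket mid-stream, so the next header read hits end-of-file and gets wrapped in the generic "Python worker exited unexpectedly (crashed)" message. The test only passes when the JVM instead surfaces the worker's "Segmentation fault" diagnostic, which is what makes the assertion racy. A minimal sketch of the framing pattern (illustrative only, not Spark's actual wire code):

          import struct

          def read_frame(stream):
              # Hypothetical reader for a length-prefixed frame protocol like
              # the JVM/worker one; the JVM does the equivalent with
              # DataInputStream.readInt() in PythonRunner.
              header = stream.read(4)
              if len(header) < 4:
                  # A segfaulted worker closes the socket mid-stream; the
                  # JVM's readInt() raises java.io.EOFException right here.
                  raise EOFError("worker stream ended before a full frame header")
              (length,) = struct.unpack(">i", header)  # big-endian int32, as in Java
              return stream.read(length)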
      

      This test is flaky; see this run for an example failure: https://github.com/apache/spark/actions/runs/6996322044/job/19032101353
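
      For reference, the failing test is shaped roughly like the sketch below. Only the parallelize(...).map(...).count() call (test_worker.py line 241) and the assertRegex (line 243) are visible in the traceback; the crash trigger f and the try/except are assumptions about the rest of the test body:

          import ctypes

          from py4j.protocol import Py4JJavaError

          def test_python_segfault(self):
              # Assumed crash trigger: reading memory at address 0 via ctypes
              # reliably segfaults the Python worker process.
              def f():
                  ctypes.string_at(0)

              try:
                  # f() runs inside the worker, which dies with SIGSEGV.
                  self.sc.parallelize([1]).map(lambda x: f()).count()
                  self.fail("should have raised an exception")
              except Py4JJavaError as e:
                  # Flaky assertion: passes only when the JVM surfaces the
                  # worker's "Segmentation fault" diagnostic instead of the
                  # generic "Python worker exited unexpectedly (crashed)".
                  self.assertRegex(str(e), "Segmentation fault")

      The NonDaemon in the test class name suggests the run disables the Python worker daemon, but the assertion is the same either way.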

People

    Assignee: gurwls223 Hyukjin Kwon
    Reporter: gurwls223 Hyukjin Kwon
    Votes: 0
    Watchers: 1
