Spark / SPARK-36907

pandas API on Spark: DataFrameGroupBy.apply raises an exception when it returns Series.


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.2.0, 3.3.0
    • Fix Version/s: 3.2.0
    • Component/s: PySpark
    • Labels: None

    Description

      DataFrameGroupBy.apply without the shortcut (i.e. when the data is larger than compute.shortcut_limit, so the pandas shortcut path is not taken) can raise an exception when the applied function returns a Series.

      >>> import pyspark.pandas as ps
      >>> ps.options.compute.shortcut_limit = 3
      >>> psdf = ps.DataFrame(
      ...     {"a": [1, 2, 3, 4, 5, 6], "b": [1, 1, 2, 3, 5, 8], "c": [1, 4, 9, 16, 25, 36]},
      ...     columns=["a", "b", "c"],
      ... )
      >>> psdf.groupby("b").apply(lambda x: x["a"])
      org.apache.spark.api.python.PythonException: Traceback (most recent call last):
      ...
      ValueError: Length mismatch: Expected axis has 2 elements, new values have 3 elements
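
      For reference, a minimal sketch of the expected behavior, assuming only the non-shortcut path is affected: plain pandas concatenates the per-group Series into a single Series of column "a" indexed by (group key, original row index), and the same pandas-on-Spark call is expected to succeed when the frame fits within compute.shortcut_limit (1000 by default), since the pandas shortcut is taken.

      import pandas as pd
      import pyspark.pandas as ps

      data = {"a": [1, 2, 3, 4, 5, 6], "b": [1, 1, 2, 3, 5, 8], "c": [1, 4, 9, 16, 25, 36]}

      # Plain pandas reference: each group's slice of column "a" is a Series, and
      # pandas concatenates them into one Series indexed by (b, original row index).
      pdf = pd.DataFrame(data, columns=["a", "b", "c"])
      print(pdf.groupby("b").apply(lambda x: x["a"]))

      # pandas-on-Spark with the shortcut: reset compute.shortcut_limit to its
      # default (1000) so the 6-row frame is handled by the pandas shortcut path,
      # where the same call is expected to succeed.
      ps.reset_option("compute.shortcut_limit")
      psdf = ps.DataFrame(data, columns=["a", "b", "c"])
      print(psdf.groupby("b").apply(lambda x: x["a"]))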
      


          People

            Assignee: Apache Spark (apachespark)
            Reporter: Takuya Ueshin (ueshin)
