Spark / SPARK-13153

PySpark ML persistence fails when handling a parameter with no default value


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.1, 2.0.0
    • Component/s: ML, PySpark
    • Labels: None
    • Related issue: SPARK-13033

    Description

      This defect was found while implementing SPARK-13033, after adding the code below to a doctest.
      It looks like _transfer_params_from_java does not consider params that have no default value; we should handle them.

      >>> import os, tempfile
      >>> path = tempfile.mkdtemp()
      >>> aftsr_path = path + "/aftsr"
      >>> aftsr.save(aftsr_path)
      >>> aftsr2 = AFTSurvivalRegression.load(aftsr_path)

      Exception details (this trace is from the equivalent IsotonicRegression doctest, which hits the same problem):
      ir2 = IsotonicRegression.load(ir_path)
      Exception raised:
      Traceback (most recent call last):
      File "C:\Python27\lib\doctest.py", line 1289, in run
      compileflags, 1) in test.globs
      File "<doctest __main.IsotonicRegression[11]>", line 1, in
      ir2 = IsotonicRegression.load(ir_path)
      File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\ml\util.py", line 194, in load
      return cls.read().load(path)
      File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\ml\util.py", line 148, in load
      instance.transfer_params_from_java()
      File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\ml\wrapper.py", line 82, in tran
      fer_params_from_java
      value = _java2py(sc, self._java_obj.getOrDefault(java_param))
      File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\py4j-0.9-src.zip\py4j\java_gateway.py", line 813, in
      _call
      answer, self.gateway_client, self.target_id, self.name)
      File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\sql\utils.py", line 45, in deco
      return f(a, *kw)
      File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\py4j-0.9-src.zip\py4j\protocol.py", line 308, in get_
      eturn_value
      format(target_id, ".", name), value)
      Py4JJavaError: An error occurred while calling o351.getOrDefault.
      : java.util.NoSuchElementException: Failed to find a default value for weightCol
      at org.apache.spark.ml.param.Params$$anonfun$getOrDefault$2.apply(params.scala:647)
      at org.apache.spark.ml.param.Params$$anonfun$getOrDefault$2.apply(params.scala:647)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.ml.param.Params$class.getOrDefault(params.scala:646)
      at org.apache.spark.ml.PipelineStage.getOrDefault(Pipeline.scala:43)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:483)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
      at py4j.Gateway.invoke(Gateway.java:259)
      at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
      at py4j.commands.CallCommand.execute(CallCommand.java:79)
      at py4j.GatewayConnection.run(GatewayConnection.java:209)
      at java.lang.Thread.run(Thread.java:745)
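
      A minimal sketch of the kind of guard that could address this, written against the _transfer_params_from_java method in pyspark/ml/wrapper.py shown in the trace above. The idea is to copy a param back from the Java side only when the Java object actually holds a value for it (explicitly set or defaulted), instead of calling getOrDefault unconditionally. The helpers assumed here (_java2py, self._java_obj.hasParam/getParam/isDefined, self._paramMap) follow the Spark 1.6 wrapper module; the actual fix committed for this ticket may differ in detail.

      # Sketch only: a guarded _transfer_params_from_java for pyspark/ml/wrapper.py
      # (SparkContext and _java2py are assumed to be imported in that module).
      def _transfer_params_from_java(self):
          """Copy params back from the companion Java object, skipping any
          Java param that has neither an explicit value nor a default."""
          sc = SparkContext._active_spark_context
          for param in self.params:
              if self._java_obj.hasParam(param.name):
                  java_param = self._java_obj.getParam(param.name)
                  # Params such as weightCol may have no default on the Java side,
                  # so an unconditional getOrDefault raises NoSuchElementException;
                  # only read back values that actually exist.
                  if self._java_obj.isDefined(java_param):
                      value = _java2py(sc, self._java_obj.getOrDefault(java_param))
                      self._paramMap[param] = value

      With such a guard, loading an AFTSurvivalRegression or IsotonicRegression model that never set weightCol would leave that param unset on the Python side instead of raising the Py4JJavaError above.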

    People

      Assignee: Wenpei Yu (tommy_cug)
      Reporter: Wenpei Yu (tommy_cug)
      Yanbo Liang