Details

Type: Bug
Status: Open
Priority: Minor
Resolution: Unresolved
Description
I'm able to get a JavaHBaseContext object from PySpark, but not an HBaseContext:
temp = sc._jvm.org.apache.hadoop.hbase.HBaseConfiguration
conf = temp.create()
hbaseCon = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(sc, conf)
Running the above code gives me this error:
AttributeError: 'SparkContext' object has no attribute '_get_object_id'
AttributeError                            Traceback (most recent call last)
in engine
----> 1 hbaseCon = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(sc, conf)

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
   1543
   1544         args_command = "".join(
-> 1545             [get_command_part(arg, self._pool) for arg in new_args])
   1546
   1547         command = proto.CONSTRUCTOR_COMMAND_NAME +\

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in <listcomp>(.0)
   1543
   1544         args_command = "".join(
-> 1545             [get_command_part(arg, self._pool) for arg in new_args])
   1546
   1547         command = proto.CONSTRUCTOR_COMMAND_NAME +\

/usr/local/lib/python3.6/site-packages/py4j/protocol.py in get_command_part(parameter, python_proxy_pool)
    296             command_part += ";" + interface
    297         else:
--> 298             command_part = REFERENCE_TYPE + parameter._get_object_id()
    299
    300         command_part += "\n"

AttributeError: 'SparkContext' object has no attribute '_get_object_id'
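The traceback shows py4j failing in get_command_part because the Python-side SparkContext wrapper is not a py4j JavaObject and so cannot be serialized as a constructor argument; HBaseContext's Scala constructor expects a JVM org.apache.spark.SparkContext. A possible workaround (an untested sketch, assuming a live PySpark session with the hbase-spark jar on the driver classpath) is to pass the underlying JVM SparkContext instead:

```python
# Sketch of a possible workaround, not a confirmed fix.
# `sc` is the PySpark SparkContext; py4j cannot convert this Python
# wrapper, which is what triggers the '_get_object_id' AttributeError.

conf = sc._jvm.org.apache.hadoop.hbase.HBaseConfiguration.create()

# sc._jsc is the py4j handle to the JavaSparkContext; its sc() method
# returns the wrapped Scala org.apache.spark.SparkContext, which is the
# type the HBaseContext constructor actually takes.
jvm_sc = sc._jsc.sc()

hbaseContext = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(jvm_sc, conf)
```

Note that even if construction succeeds, the resulting object is a raw py4j handle to the Scala HBaseContext, not a Python API; JavaHBaseContext exists precisely to bridge the Java/Python side, which may be why only it is reachable today.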