Apache Arrow
ARROW-7841

[C++] HADOOP_HOME doesn't work to find libhdfs.so



    Description

      I have my environment variable set up correctly according to the pyarrow README:

      $ ls $HADOOP_HOME/lib/native
      libhadoop.a  libhadooppipes.a  libhadoop.so  libhadoop.so.1.0.0  libhadooputils.a  libhdfs.a  libhdfs.so  libhdfs.so.0.0.0 

      Use the following script to reproduce:

      import pyarrow
      pyarrow.hdfs.connect('hdfs://localhost')

      With pyarrow version 0.15.1 this works fine.

      However, version 0.16.0 gives the following error:

      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "/home/jackwindows/anaconda2/lib/python2.7/site-packages/pyarrow/hdfs.py", line 215, in connect
          extra_conf=extra_conf)
        File "/home/jackwindows/anaconda2/lib/python2.7/site-packages/pyarrow/hdfs.py", line 40, in __init__
          self._connect(host, port, user, kerb_ticket, driver, extra_conf)
        File "pyarrow/io-hdfs.pxi", line 89, in pyarrow.lib.HadoopFileSystem._connect
        File "pyarrow/error.pxi", line 99, in pyarrow.lib.check_status
      IOError: Unable to load libhdfs: /opt/hadoop/latest/libhdfs.so: cannot open shared object file: No such file or directory 
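
      Note the path in the error: libhdfs.so is being searched for directly under /opt/hadoop/latest (presumably the value of HADOOP_HOME here) rather than under $HADOOP_HOME/lib/native, where the listing above shows it actually lives. As a possible workaround (a minimal sketch, assuming the directory layout shown above), ARROW_LIBHDFS_DIR can be pointed directly at the lib/native directory before connecting, since pyarrow consults that variable when locating libhdfs:

      import os
      import pyarrow

      # Workaround sketch: tell pyarrow exactly where libhdfs.so is,
      # bypassing the HADOOP_HOME-based lookup that fails in 0.16.0.
      # Adjust the path if your layout differs from the listing above.
      os.environ['ARROW_LIBHDFS_DIR'] = os.path.join(
          os.environ['HADOOP_HOME'], 'lib', 'native')

      fs = pyarrow.hdfs.connect('hdfs://localhost')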

            People

              Assignee: Kouhei Sutou (kou)
              Reporter: Jack Fan (JackWindows)