  Sqoop (Retired) / SQOOP-3137

Sqoop does not support mainframe sequential datasets, kindly help


Details

    Description

      I can't find information on this anywhere on the net. If it has already been fixed, please tell me what needs to be done to make it work; it is very urgent, since 90 percent of our mainframe files are sequential datasets. Also, kindly document this in the Sqoop guide so that everyone knows about it.

      Version used in our Hadoop:
      bash-4.1$ sqoop version
      Warning: /usr/hdp/2.4.2.0-258/accumulo does not exist! Accumulo imports will fail.
      Please set $ACCUMULO_HOME to the root of your Accumulo installation.
      17/02/20 08:38:53 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
      Sqoop 1.4.6.2.4.2.0-258
      git commit id 625a64c91fe47b5c1c4d785de8d53100ba50e60a
      Compiled by jenkins on Mon Apr 25 08:00:47 UTC 2016
      bash-4.1$

      Command Used in Hadoop:
      sqoop import-mainframe --connect sysv.pershing.com --dataset T.EBS.R100.GBKPDET3.ORTDE30 --username TSOMSUM --P --target-dir /addh1060/nanda/TstSqop
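      For reference: to my knowledge, the Sqoop 1.4.7 user guide documents a --datasettype option for import-mainframe, with s for sequential, p for partitioned and g for GDG data sets. Whether this 1.4.6-based HDP build ships that option is an assumption here, so please verify with "sqoop help import-mainframe" first. A sketch of the same import with that flag, reusing the host, dataset, user and target directory from the command above:

      sqoop import-mainframe --connect sysv.pershing.com --dataset T.EBS.R100.GBKPDET3.ORTDE30 --datasettype s --username TSOMSUM -P --target-dir /addh1060/nanda/TstSqop

      The output below is from the command as actually run, without that flag.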
      Error (the dataset is not detected):
      drwxr-xr-x - xbbl1wx addh1060 0 2017-01-06 04:48 addh1060/nanda/Copybook
      bash-4.1$ sqoop import-mainframe --connect sysv.pershing.com --dataset T.EBS.R100.GBKPDET3.ORTDE30 --username TSOMSUM --P --target-dir /addh1060/nanda/TstSqop
      Warning: /usr/hdp/2.4.2.0-258/accumulo does not exist! Accumulo imports will fail.
      Please set $ACCUMULO_HOME to the root of your Accumulo installation.

      17/02/20 08:13:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
      Enter password:
      17/02/20 08:14:02 INFO tool.CodeGenTool: Beginning code generation
      17/02/20 08:14:02 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.4.2.0-258/hadoop-mapreduce
      Note: /tmp/sqoop-xbbl1wx/compile/8416e1c82ec6c0eb663fd0e48470c159/T_EBS_R100_GBKPDET3_ORTDE30.java uses or overrides a deprecated API.
      Note: Recompile with -Xlint:deprecation for details.
      17/02/20 08:14:04 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-xbbl1wx/compile/8416e1c82ec6c0eb663fd0e48470c159/T.EBS.R100.GBKPDET3.ORTDE30.jar
      17/02/20 08:14:04 INFO mapreduce.ImportJobBase: Beginning import of T.EBS.R100.GBKPDET3.ORTDE30
      SLF4J: Class path contains multiple SLF4J bindings.
      SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      17/02/20 08:14:04 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
      17/02/20 08:14:05 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
      17/02/20 08:14:05 WARN util.Jars: No such class available.
      17/02/20 08:14:06 INFO impl.TimelineClientImpl: Timeline service address: http://r37bn00.bnymellon.net:8188/ws/v1/timeline/
      17/02/20 08:14:06 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 246754 for xbbl1wx on ha-hdfs:myclustertesttpc
      17/02/20 08:14:06 INFO security.TokenCache: Got dt for hdfs://myclustertesttpc; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:myclustertesttpc, Ident: (HDFS_DELEGATION_TOKEN token 246754 for xbbl1wx)
      17/02/20 08:14:06 WARN token.Token: Cannot find class for token kind kms-dt
      17/02/20 08:14:06 INFO security.TokenCache: Got dt for hdfs://myclustertesttpc; Kind: kms-dt, Service: 10.59.90.136:9292, Ident: 00 07 78 62 62 6c 31 77 78 07 64 73 68 6d 79 74 6d 00 8a 01 5a 5b a8 63 81 8a 01 5a 7f b4 e7 81 8e b8 20 1f
      17/02/20 08:14:06 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm2
      17/02/20 08:14:08 INFO mainframe.MainframeDatasetInputFormat: Datasets to transfer from: T.EBS.R100.GBKPDET3.ORTDE30
      17/02/20 08:14:08 INFO util.MainframeFTPClientUtils: Connected to sysv.pershing.com on 21
      17/02/20 08:14:08 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/xbbl1wx/.staging/job_1485005128036_45849
      17/02/20 08:14:08 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not list datasets from T.EBS.R100.GBKPDET3.ORTDE30:java.io.IOException: Could not login to server sysv.pershing.com:221 Quit command received. Goodbye.
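
      Since the exception above is raised by Sqoop's FTP utilities before any map task runs ("Could not list datasets" / "Could not login to server"), it can help to reproduce the dataset listing by hand from the same edge node. The line below is only a sketch of such a check using the stock command-line ftp client; the quoted high-level-qualifier convention for z/OS and the REPLACE_WITH_PASSWORD placeholder are illustrative assumptions, not values taken from this job:

      printf '%s\n' "user TSOMSUM REPLACE_WITH_PASSWORD" "cd 'T.EBS.R100.GBKPDET3'" "ls" "bye" | ftp -inv sysv.pershing.com 21

      If the manual login or listing fails in the same way, the problem is on the FTP/security side rather than in Sqoop's handling of sequential datasets.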

          People

            Assignee: Unassigned
            Reporter: Nandakishore
            Votes: 0
            Watchers: 1
