
Trouble starting sqoop 2 job

Question asked by kannu.kumar on Nov 17, 2016
Latest reply on Dec 12, 2016 by Rachel Silver

Hello Community,

 

As we know, Sqoop 2 is designed to transfer data from a relational database to HDFS and vice versa.

I am running a single-node Hadoop instance called "maprdemo", which I downloaded from the MapR website.

I followed the document provided here (Command Line Shell — Apache Sqoop documentation) and managed to create the links and a job with id 1. The problem occurs when I run the job, and I get the errors below. I have set the verbose option to true so we can see the actual error.
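For reference, verbose output was enabled from the Sqoop shell with the standard option command:

```
sqoop:000> set option --name verbose --value true
```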

 

Details about the job:

sqoop:000> show connector
0 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
+----+------------------------+------------------+------------------------------------------------------+----------------------+
| Id | Name | Version | Class | Supported Directions |
+----+------------------------+------------------+------------------------------------------------------+----------------------+
| 1 | kite-connector | 1.99.6-mapr-1607 | org.apache.sqoop.connector.kite.KiteConnector | FROM/TO |
| 2 | kafka-connector | 1.99.6-mapr-1607 | org.apache.sqoop.connector.kafka.KafkaConnector | TO |
| 3 | hdfs-connector | 1.99.6-mapr-1607 | org.apache.sqoop.connector.hdfs.HdfsConnector | FROM/TO |
| 4 | generic-jdbc-connector | 1.99.6-mapr-1607 | org.apache.sqoop.connector.jdbc.GenericJdbcConnector | FROM/TO |
+----+------------------------+------------------+------------------------------------------------------+----------------------+
sqoop:000> create link --cid 4
Creating link for connector with id 4
Please fill following values to create new link object
Name: SQL Connect to LocalHost

Link configuration

JDBC Driver Class: com.microsoft.sqlserver.jdbc.SQLServerDriver
JDBC Connection String: jdbc:sqlserver://192.168.56.1:1433
Username: sa
Password: **********
JDBC Connection Properties:
There are currently 0 values in the map:
entry# protocol=tcp
There are currently 1 values in the map:
protocol = tcp
entry#
New link was successfully created with validation status OK and persistent id 1
sqoop:000> create link --cid 3
Creating link for connector with id 3
Please fill following values to create new link object
Name: HDFS connection

Link configuration

HDFS URI: maprfs://maprdemo:8443
Hadoop conf directory: /opt/mapr/hadoop/hadoop-2.7.0/etc/hadoop

There are issues with entered data, please revise your input:
Name: HDFS connection

Link configuration

HDFS URI: maprfs://maprdemo:8443
Error message: Path is not a valid directory
Hadoop conf directory: /opt/mapr/hadoop/hadoop-2.7.0/etc/hadoop
New link was successfully created with validation status OK and persistent id 2
sqoop:000> show link
+----+--------------------------+--------------+------------------------+---------+
| Id | Name | Connector Id | Connector Name | Enabled |
+----+--------------------------+--------------+------------------------+---------+
| 1 | SQL Connect to LocalHost | 4 | generic-jdbc-connector | true |
| 2 | HDFS connection | 3 | hdfs-connector | true |
+----+--------------------------+--------------+------------------------+---------+
sqoop:000> create job -f 1 -t 2
Creating job for links with from id 1 and to id 2
Please fill following values to create new job object
Name: SQL to Hadoop

From database configuration

Schema name: dbo
Table name: tblEmployee
Table SQL statement:
Table column names:
Partition column name:
Null value allowed for the partition column:
Boundary query:

Incremental read

Check column:
Last value:

To HDFS configuration

Override null value:
Null value:
Output format:
0 : TEXT_FILE
1 : SEQUENCE_FILE
Choose: 0
Compression format:
0 : NONE
1 : DEFAULT
2 : DEFLATE
3 : GZIP
4 : BZIP2
5 : LZO
6 : LZ4
7 : SNAPPY
8 : CUSTOM
Choose: 0
Custom compression format:
Output directory: /user/mapr
Append mode:

Throttling resources

Extractors: 1
Loaders: 1
New job was successfully created with validation status OK and persistent id 1
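For completeness, the created objects can be re-checked from the shell with the standard Sqoop 2 commands before starting the job:

```
sqoop:000> show link --all
sqoop:000> show job --all
```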

 
sqoop:000> start job --jid 1
Exception has occurred during processing command
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception
Stack trace:
at org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:140)
at org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:199)
at org.apache.sqoop.client.request.JobResourceRequest (JobResourceRequest.java:112)
at org.apache.sqoop.client.request.SqoopResourceRequests (SqoopResourceRequests.java:157)
at org.apache.sqoop.client.SqoopClient (SqoopClient.java:452)
at org.apache.sqoop.shell.StartJobFunction (StartJobFunction.java:80)
at org.apache.sqoop.shell.SqoopFunction (SqoopFunction.java:51)
at org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:149)
at org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:111)
at org.codehaus.groovy.tools.shell.Command$execute (null:-1)
at org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42)
at org.codehaus.groovy.tools.shell.Command$execute (null:-1)
at org.codehaus.groovy.tools.shell.Shell (Shell.groovy:101)
at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:-1)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method (Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)
at groovy.lang.MetaMethod (MetaMethod.java:233)
at groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:173)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method (Method.java:606)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:141)
at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:121)
at org.codehaus.groovy.tools.shell.Shell (Shell.groovy:114)
at org.codehaus.groovy.tools.shell.Shell$leftShift$0 (null:-1)
at org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:88)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method (Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)
at groovy.lang.MetaMethod (MetaMethod.java:233)
at groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:100)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method (Method.java:606)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:137)
at org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method (Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)
at groovy.lang.MetaMethod (MetaMethod.java:233)
at groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:66)
at java_lang_Runnable$run (null:-1)
at org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:112)
at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:463)
at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:402)
at org.apache.sqoop.shell.SqoopShell (SqoopShell.java:130)
Caused by: Exception: org.apache.sqoop.common.SqoopException Message: GENERIC_HDFS_CONNECTOR_0007:Invalid output directory - Unexpected exception
Stack trace:
at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:71)
at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:35)
at org.apache.sqoop.driver.JobManager (JobManager.java:449)
at org.apache.sqoop.driver.JobManager (JobManager.java:373)
at org.apache.sqoop.driver.JobManager (JobManager.java:276)
at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:380)
at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:116)
at org.apache.sqoop.server.v1.JobServlet (JobServlet.java:96)
at org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:79)
at javax.servlet.http.HttpServlet (HttpServlet.java:646)
at javax.servlet.http.HttpServlet (HttpServlet.java:723)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:604)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter (DelegationTokenAuthenticationFilter.java:277)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:567)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor (Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606)
at org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)
at java.lang.Thread (Thread.java:745)
Caused by: Exception: java.io.IOException Message: Could not create FileClient
Stack trace:
at com.mapr.fs.MapRFileSystem (MapRFileSystem.java:641)
at com.mapr.fs.MapRFileSystem (MapRFileSystem.java:702)
at com.mapr.fs.MapRFileSystem (MapRFileSystem.java:1432)
at com.mapr.fs.MapRFileSystem (MapRFileSystem.java:1056)
at org.apache.hadoop.fs.FileSystem (FileSystem.java:1460)
at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:58)
at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:35)
at org.apache.sqoop.driver.JobManager (JobManager.java:449)
at org.apache.sqoop.driver.JobManager (JobManager.java:373)
at org.apache.sqoop.driver.JobManager (JobManager.java:276)
at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:380)
at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:116)
at org.apache.sqoop.server.v1.JobServlet (JobServlet.java:96)
at org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:79)
at javax.servlet.http.HttpServlet (HttpServlet.java:646)
at javax.servlet.http.HttpServlet (HttpServlet.java:723)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:604)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter (DelegationTokenAuthenticationFilter.java:277)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:567)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor (Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606)
at org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)
at java.lang.Thread (Thread.java:745)
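In case it is relevant: the root cause is "Could not create FileClient" from MapRFileSystem, which makes me wonder whether the HDFS URI in link 2 (maprfs://maprdemo:8443) is the problem, since I believe 8443 is normally the MCS web port rather than the CLDB port. As a sanity check, one could try listing the output directory directly over MapR-FS (these are plain Hadoop CLI commands; the URIs below are just the ones from my setup):

```
# Check the output directory is reachable via the default maprfs URI
hadoop fs -ls maprfs:///user/mapr

# Compare against the explicit host:port used in the Sqoop link
hadoop fs -ls maprfs://maprdemo:8443/user/mapr
```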

Any help will be appreciated.

 

Thanks

 

Kannu
