
Sqoop error in append mode.

Question asked by premg on Jan 15, 2015
Latest reply on Jan 16, 2015 by premg
When running Sqoop in append mode, the first command runs fine but the second command fails and throws an error.

-- this one runs fine
1) sqoop import --connect jdbc:mysql://abc.com/pg_tmp  --table pg_tmp --username admin_schema --password  zz --direct --num-mappers 8 --target-dir /user/hive/warehouse/mysql/usershard1.db/pg_tmp  --append --where "c1 < 500000" --verbose

-- this one errors out
2)  sqoop import --connect jdbc:mysql://abc.com/pg_tmp  --table pg_tmp --username admin_schema --password  zz --direct --num-mappers 8 --target-dir /user/hive/warehouse/mysql/usershard1.db/pg_tmp  --append --where "c1 > 500000" --verbose

....
....
15/01/16 01:56:24 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 12.3661 seconds (0 bytes/sec)
15/01/16 01:56:24 INFO mapreduce.ImportJobBase: Retrieved 339824 records.
15/01/16 01:56:24 INFO util.AppendUtils: Appending to directory pg_tmp
15/01/16 01:56:24 INFO util.AppendUtils: Using found partition 8
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: _SUCCESS ignored
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00001 repartitioned to: part-m-00008
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00000 repartitioned to: part-m-00009
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00007 repartitioned to: part-m-00010
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00005 repartitioned to: part-m-00011
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00002 repartitioned to: part-m-00012
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00004 repartitioned to: part-m-00013
15/01/16 01:56:24 DEBUG util.AppendUtils: Filename: part-m-00006 repartitioned to: part-m-00014
15/01/16 01:56:24 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Error: Directory not empty
        at com.mapr.fs.MapRFileSystem.rename(MapRFileSystem.java:805)
        at org.apache.sqoop.util.AppendUtils.moveFiles(AppendUtils.java:183)
        at org.apache.sqoop.util.AppendUtils.append(AppendUtils.java:104)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:420)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
----------------
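From the stack trace it looks like the failure happens while AppendUtils.moveFiles renames the newly imported files (repartitioned to part-m-00008 and up) into the existing target directory, and MapR-FS rejects the rename with "Directory not empty". Listing the target directory after the first import should confirm which part files are already sitting there, e.g.:

hadoop fs -ls /user/hive/warehouse/mysql/usershard1.db/pg_tmp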
I'm running all commands as root.
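As an untested workaround sketch (the staging path below is just a placeholder, not something from my setup): run the second import into a fresh staging directory without --append, then move the part files into the warehouse directory by hand under non-conflicting names:

# hypothetical staging directory; it must not exist before the import
STAGING=/user/root/pg_tmp_staging
TARGET=/user/hive/warehouse/mysql/usershard1.db/pg_tmp

sqoop import --connect jdbc:mysql://abc.com/pg_tmp --table pg_tmp \
  --username admin_schema --password zz --direct --num-mappers 8 \
  --target-dir $STAGING --where "c1 > 500000" --verbose

# move each part file into the final directory with a prefix so the
# names do not clash with the part-m-0000x files from the first import
for f in $(hadoop fs -ls $STAGING | awk '/part-m-/ {print $NF}'); do
  hadoop fs -mv $f $TARGET/staged_$(basename $f)
done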

System details:

CentOS release 6.5 (Final)

$ java -version
java version "1.7.0_65"
OpenJDK Runtime Environment (rhel-2.5.1.2.el6_5-x86_64 u65-b17)
OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)

$ rpm -qa | grep -i mapr
mapr-zk-internal-3.1.1.26113.GA.v3.4.5-1.x86_64
mapr-nfs-3.1.1.26113.GA-1.x86_64
mapr-hive-0.12.24975-1.noarch
mapr-sqoop-1.4.4.23554-1.noarch
mapr-fileserver-3.1.1.26113.GA-1.x86_64
mapr-zookeeper-3.1.1.26113.GA-1.x86_64
mapr-tasktracker-3.1.1.26113.GA-1.x86_64
mapr-drill-0.7.0.29434-1.noarch
mapr-hiveserver2-0.12.24975-1.noarch
mapr-core-3.1.1.26113.GA-1.x86_64
mapr-cldb-3.1.1.26113.GA-1.x86_64
mapr-hivemetastore-0.12.24975-1.noarch
