
Sqoop Incremental Import

Question asked by m_s on Jun 8, 2013
Latest reply on Jul 17, 2013 by nabeel
I'm trying a Sqoop incremental import in append mode to load updated/added records into HDFS. The MapReduce job picks up the expected rows (the new rows in the table), and the MapReduce output also reports the expected row count. However, it always shows 0 bytes transferred and inserts a null record into the Hive table (probably by creating an empty file for the table). I also see the message below:

<pre>
fs.MapRFileSystem: Cannot rename across volumes, falling back on copy/delete semantics
</pre>
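From what I understand, this message means the job output and the target directory are on different MapR volumes, so MapR-FS cannot do an atomic rename and falls back to copy/delete. One way to check which volumes the paths fall under is to list the volume mount points (a quick sketch; assumes maprcli is available on the node, and that the column names are right):

<code>
# Show each volume and where it is mounted in the filesystem tree,
# to see which volumes /app/dev/zz_temp and the job's temporary
# output directory belong to.
maprcli volume list -columns volumename,mountdir
</code>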

<code>
sqoop import --connect <JDBC_URL> --username <USERNAME> --password <PASSWORD> \
  --table <TABLE> --split-by ID --hive-table zz_temp \
  --check-column update_date --incremental append \
  --last-value '2013-06-06 18:47:17.0' --target-dir /app/dev/zz_temp
</code>
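One way to confirm what actually landed is to list the target directory after the import (same path as in the command above):

<code>
# List the files Sqoop wrote under the target directory to check
# their sizes; the third column of the listing is the file size.
hadoop fs -ls /app/dev/zz_temp
</code>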

Any suggestions?
