| Job Wizard Parameter | Value                      |
| -------------------- | -------------------------- |
| Name                 | mysql-import-job-demo      |
| Type                 | IMPORT                     |
| Connection           | mysql-connection-demo      |
| Table name           | test                       |
| Storage Type         | HDFS                       |
| Output format        | TEXT_FILE                  |
| Output directory     | /tmp/mysql-import-job-demo |

*Job wizard form values.*
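For readers more familiar with the classic Sqoop command line, the wizard settings above correspond roughly to an import like the sketch below. The MySQL host, database name, and username are hypothetical placeholders, not values from the demo.

```bash
# Rough command-line equivalent of the wizard settings above (classic Sqoop syntax).
# Host "mysql.example.com", database "demo", and user "hue" are hypothetical.
sqoop import \
  --connect jdbc:mysql://mysql.example.com/demo \
  --username hue -P \
  --table test \
  --as-textfile \
  --target-dir /tmp/mysql-import-job-demo
```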
### 3. Save and Submit the Job
At the end of the Job wizard, click “Save and Run”! The job should automagically start and the job dashboard will be displayed. While the job is running, a progress bar below the job listing is updated dynamically. Links to the HDFS output via the File Browser and to the MapReduce logs via the Job Browser are available on the left-hand side of the job edit page.
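Once the job finishes, the import can also be verified from a shell. The sketch below assumes the output directory chosen in the wizard and a standard HDFS client; the exact part file names may differ.

```bash
# List the files Sqoop wrote to the output directory chosen in the wizard
hadoop fs -ls /tmp/mysql-import-job-demo

# Peek at one of the imported text files (the part file name may differ)
hadoop fs -cat /tmp/mysql-import-job-demo/part-m-00000 | head
```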
# Sum Up
The new Sqoop application enables batch data migration from more traditional databases to Hadoop and vice versa through Hue. Using Hue, a user can move data between storage systems in a distributed fashion with the click of a button.
I’d like to send out a big thank you to the Sqoop community for the new client-server design!
Both projects are undergoing heavy development and are welcoming external contributions! Have any suggestions? Feel free to tell us what you think through [hue-user][4] or [@gethue][5]!
[1]: http://gethue.com
[2]: http://hadoop.apache.org/
[3]: http://sqoop.apache.org/
[4]: http://groups.google.com/a/cloudera.org/group/hue-user
[5]: https://twitter.com/gethue