**View the Hive-related arguments**

<hr/>

```shell
[root@hadoop101 /]# sqoop import --help
Hive arguments:
   --create-hive-table                         Automatically create the table in Hive; fails if the
                                               table already exists, so it is rarely used in production
   --hive-database <database-name>             The Hive database to import into
   --hive-delims-replacement <arg>             Replace Hive record \0x01 and row delimiters (\n\r)
                                               from imported string fields with user-defined string
   --hive-drop-import-delims                   Drop Hive record \0x01 and row delimiters (\n\r) from
                                               imported string fields
   --hive-home <dir>                           Override $HIVE_HOME
   --hive-import                               Import tables into Hive (uses Hive's default
                                               delimiters if none are set)
   --hive-overwrite                            Overwrite existing data in the Hive table
   --hive-partition-key <partition-key>        The partition column
   --hive-partition-value <partition-value>    The partition value
   --hive-table <table-name>                   The Hive table to import data into
   --map-column-hive <arg>                     Override mapping for specific column to Hive types
```

<br/>

**Import data into a Hive table**

<hr/>

```shell
sqoop import \
--connect jdbc:mysql://hadoop101:3306/sqoop_db \
--table orders \
--username root \
--password 123456 \
--hive-import \
--create-hive-table \
--hive-database h_sqoop_db \
--hive-table orders \
-m 3
```

<br/>

**Import into a partition**

<hr/>

```shell
# Although --target-dir is specified, the data does not stay there:
# after the import, Sqoop moves the files into the Hive warehouse.
# --hive-table h_sqoop_db.orders_part is equivalent to
#   --hive-database h_sqoop_db
#   --hive-table orders_part
sqoop import \
--connect jdbc:mysql://hadoop101:3306/sqoop_db \
--query "select order_id, order_status from orders where order_date>='2014-07-24' and order_date<'2014-07-26' and \$CONDITIONS" \
--username root \
--password 123456 \
--target-dir /user/data/orders \
--split-by order_status \
--hive-import \
--hive-table h_sqoop_db.orders_part \
--hive-partition-key "order_date" \
--hive-partition-value "20140724" \
-m 3
```
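<br/>

To sanity-check both imports, you can query Hive from the shell. A minimal sketch, assuming the Hive CLI is available on the same host and using the database and table names from the examples above:

```shell
# Confirm the full-table import landed in h_sqoop_db.orders
hive -e "USE h_sqoop_db; SELECT COUNT(*) FROM orders;"

# Confirm the partitioned import: a partition order_date=20140724 should exist
hive -e "USE h_sqoop_db; SHOW PARTITIONS orders_part;"
hive -e "SELECT * FROM h_sqoop_db.orders_part WHERE order_date='20140724' LIMIT 10;"
```

Note that --hive-partition-value is a static value: every row selected by the query goes into the single partition order_date=20140724, so importing another day's data means running the job again with a different --hive-partition-value.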