I was trying to transfer a large Hive table from one of our non-secure clusters into a Kerberos-enabled secure cluster. Since I didn't have enough temporary space on the source server, I was looking for a built-in Hadoop command that supports direct file transfer between secure and non-secure clusters.
If you run the command below from the secure cluster, it allows the client to fall back to simple authentication when connecting to the non-secure cluster:
hdfs dfs -D ipc.client.fallback-to-simple-auth-allowed=true -copyToLocal hdfs://xxx.tanu.com:8020/user/hive/warehouse/tanu.db/tanu_info /user/hive/warehouse/tanu.db/
hdfs dfs -copyFromLocal tanu_info /user/hive/warehouse/tanu.db/
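Putting the two steps together, here is a minimal sketch of the whole two-hop transfer, run from an edge node of the secure cluster. It assumes a valid Kerberos ticket (kinit) on that node and enough local disk for the staging copy; the staging directory is a hypothetical name of my choosing:

# pull from the non-secure cluster to local disk, allowing simple auth
SRC=hdfs://xxx.tanu.com:8020/user/hive/warehouse/tanu.db/tanu_info
STAGE=/tmp/tanu_stage            # hypothetical local staging directory
mkdir -p "$STAGE"
hdfs dfs -D ipc.client.fallback-to-simple-auth-allowed=true -copyToLocal "$SRC" "$STAGE"
# push the staged files into the secure cluster's warehouse directory
hdfs dfs -copyFromLocal "$STAGE/tanu_info" /user/hive/warehouse/tanu.db/
# reclaim the temporary space once the upload succeeds
rm -rf "$STAGE"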
Once the files are imported, you still need to recreate the table definition on the destination. Run the SQL query below in the source cluster's Hive editor to get the table's CREATE statement:
show create table tanu_info
Then create the tanu_info table on the destination cluster using that CREATE statement.
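If you'd rather script this step than use the Hive editor, the same statement can be captured from a shell with beeline; a sketch, assuming a HiveServer2 endpoint on the source cluster (the hostname, port, and output file are placeholders):

beeline -u "jdbc:hive2://source-host:10000/tanu" --silent=true --outputformat=tsv2 \
    -e "show create table tanu_info" > create_tanu_info.sql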
Once you have imported the Hive data and created the table, run the query below in the Hue Hive editor; it will rebuild the table's partition metadata from the files already loaded in HDFS:
msck repair table tanu_info
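MSCK REPAIR only discovers partitions, so this step matters for partitioned tables; for a non-partitioned table the data is queryable as soon as the files land in the table directory. Either way, a quick sanity check in the same editor confirms the transfer:

-- sanity checks after the repair (show partitions assumes a partitioned table)
show partitions tanu_info;
select count(*) from tanu_info;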