If your Hive version is at least 0.11.0, you can execute:
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/directoryWhereToStoreData'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SELECT * FROM yourTable;
from the hive or beeline CLI to store the table's contents in a directory on the local filesystem.
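If you want to keep the statement around for reuse, one option (a sketch; the script path, output directory, and table name are all placeholders) is to save it to an .hql file and run it non-interactively with hive -f:

```shell
# Write the export statement to a script file (paths/table name are placeholders).
cat > /tmp/export_table.hql <<'EOF'
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/directoryWhereToStoreData'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SELECT * FROM yourTable;
EOF
# Then run it non-interactively (requires a working hive CLI):
# hive -f /tmp/export_table.hql
```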
Alternatively, with beeline, save your SELECT query in yourSQLFile.sql and run:
beeline -u 'jdbc:hive2://[databaseaddress]' --outputformat=csv2 -f yourSQLFile.sql > theFileWhereToStoreTheData.csv
This also stores the result in a file on the local filesystem.
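Note that csv2 output includes a header row with the column names. If you only want the data rows, one way to drop it (a sketch; a hand-made sample file stands in for a real beeline export, since no Hive server is assumed here) is tail:

```shell
# sample.csv stands in for theFileWhereToStoreTheData.csv (no Hive server assumed).
printf 'id,name\n1,alice\n2,bob\n' > /tmp/sample.csv
# Skip the csv2 header (line 1), keep everything from line 2 on.
tail -n +2 /tmp/sample.csv > /tmp/sample_noheader.csv
```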
From hive, to store the data somewhere in HDFS:
CREATE EXTERNAL TABLE output
LIKE yourTable
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
LOCATION 'hdfs://WhereDoYou/Like';
INSERT OVERWRITE TABLE output SELECT * from yourTable;
Then you can collect the data into a local file using (getmerge requires a local destination argument; the filename is a placeholder):
hdfs dfs -getmerge /WhereDoYou/Like localOutputFile.csv
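getmerge simply concatenates every part file under the given HDFS directory into a single local file. A local sketch of the same effect (the hdfs command itself is commented out because it needs a cluster; all paths are placeholders):

```shell
# On a real cluster (the local destination argument is required):
#   hdfs dfs -getmerge /WhereDoYou/Like /tmp/merged.csv
# Locally, the effect is equivalent to concatenating the part files in order:
mkdir -p /tmp/parts
printf 'a,1\n' > /tmp/parts/part-00000
printf 'b,2\n' > /tmp/parts/part-00001
cat /tmp/parts/part-* > /tmp/merged.csv
```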
This is another option to get the data using beeline only:
env HADOOP_CLIENT_OPTS="-Ddisable.quoting.for.sv=false" beeline -u "jdbc:hive2://your.hive.server.address:10000/" --incremental=true --outputformat=csv2 -e "select * from youdatabase.yourtable"
Verified on:
Connected to: Apache Hive (version 1.1.0-cdh5.10.1)
Driver: Hive JDBC (version 1.1.0-cdh5.10.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.1.0-cdh5.10.1 by Apache Hive
Note that --outputformat doesn't work if you place it after the -e query or -f file switches. Strange, I thought these were named arguments, but beeline seems to ignore any arguments that come after the query; I kept getting the default table format. – Ralph