How to write Hive query results to a CSV file using Beeline (ranger)

I am new to Hive and I am trying to run a Hive query using PuTTY, and I want the output in CSV format without overwriting existing files in the directory. I used the following command:
echo `beeline-ranger --outputformat=tsv2 -e 'select distinct xyz from database.table;' > /C:/Users/name/Documents/TBA /sample_${TODAY}.tsv`
I am trying to run this in the PuTTY hive environment, but I get the error below:
ParseException line 1:59 character '<EOF>' not supported here
What is wrong with the command? Thanks in advance.

This should give you a solution:
beeline -u 'jdbc:hive2://[databaseaddress]' --outputformat=csv2 -e 'select * from table' > theFileWhereToStoreTheData.csv
OR
beeline -u 'jdbc:hive2://[databaseaddress]' --outputformat=csv2 -f yourSQlFile.sql > theFileWhereToStoreTheData.csv
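A ParseException like the one in the question usually means the export command was typed at the Hive/Beeline prompt instead of at the operating-system shell. As a hedged sketch, here is the question's export run from the shell (the JDBC address is a placeholder; substitute your beeline-ranger wrapper if your environment requires it, and adjust the date format as needed):

```shell
# Run from the OS shell, not from inside hive/beeline.
TODAY=$(date +%Y-%m-%d)                     # date stamp for the output file name
beeline -u 'jdbc:hive2://[databaseaddress]' --outputformat=tsv2 \
  -e 'select distinct xyz from database.table;' > "sample_${TODAY}.tsv"
```

The redirection is performed by the shell, so the file lands on the local filesystem of the machine where you run the command.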

Related

Add date in Mysqldump when running cmd from Java and Shell

I would like to append the date to the file name when taking a backup with mysqldump. I store the command in a properties file and run it via ProcessBuilder and a shell script. I have tried multiple ways to add the date (all the answers I found assume the command is run directly in Linux):
mysqldump -u <user> -p <database> | gzip > <backup>$(date +%Y-%m-%d-%H.%M.%S).sql.gz
Got the error: No table found for "+%Y-%m-%d-%H.%M.%S"
mysqldump -u root -ppassword dbName --result-file=/opt/backup/`date -Iminutes`.dbName.sql
Got the error: unknow option -I
Is there a way to add the date in the command itself? I cannot append the date in the Java method or the shell script.
I can't tell from your question whether you're running in a shell. If so, these three lines will generate your backup file:
DATE=`date +%Y-%m-%d-%H.%M.%S`
FILENAME=backup${DATE}.sql.gz
mysqldump -u user -p database | gzip > ${FILENAME}
Note that the date command on the first line is wrapped in backticks (command substitution), not ${}, to capture its output in the DATE shell variable.
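If the command is launched from Java's ProcessBuilder, no shell is involved, so `$(...)` and backticks reach mysqldump as literal text (hence the "No table found" error). One hedged workaround, assuming a POSIX `sh` exists on the host, is to hand the whole pipeline to `sh -c` so a shell performs the substitution:

```shell
# The single-quoted string is passed to sh, which expands $(date ...)
# before mysqldump runs. user/database are placeholders from the question.
sh -c 'mysqldump -u user -p database | gzip > "backup$(date +%Y-%m-%d-%H.%M.%S).sql.gz"'
```

From Java that would correspond to something like `new ProcessBuilder("sh", "-c", "mysqldump ... | gzip > \"backup$(date +%Y-%m-%d-%H.%M.%S).sql.gz\"")`, keeping the date logic in the command itself.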

Bash scripting to insert a .csv file into MySQL with particular columns [duplicate]

I want to make a bash script that connects to my MySQL server and inserts some values from a txt file.
I have written this down:
#!/bin/bash
echo "INSERT INTO test (IP,MAC,SERVER) VALUES ('cat test.txt');" | mysql -uroot -ptest test;
but I'm receiving the following error:
ERROR 1136 (21S01) at line 1: Column count doesn't match value count
at row 1
I suppose the error is in my txt file, but I've tried many variations and still had no success.
My txt file looks like this:
10.16.54.29 00:f8:e5:33:22:3f marsara
Try this one:
#!/bin/bash
inputfile="test.txt"
while read -r ip mac server; do
  echo "INSERT INTO test (IP,MAC,SERVER) VALUES ('$ip', '$mac', '$server');"
done < "$inputfile" | mysql -uroot -ptest test
This way both the file read and the mysql command execution are streamed.
Assuming you have many rows to add, you probably want the LOAD DATA INFILE statement rather than INSERT. The source file has to be on the server, but that seems to be the case here.
Something like this:
#!/bin/bash
mysql -uroot -ptest test << EOF
LOAD DATA INFILE 'test.txt'
INTO TABLE tbl_name
FIELDS TERMINATED BY ' ';
EOF
LOAD DATA INFILE has many options, as you will discover by reading the documentation.
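If the file lives on the client machine rather than on the server, the LOCAL keyword makes the mysql client ship it to the server instead. A sketch, assuming local_infile is enabled on both client and server and that the target table is the question's test table:

```shell
#!/bin/bash
# LOAD DATA LOCAL INFILE reads test.txt on the client and sends it to the
# server; plain LOAD DATA INFILE reads from the server's filesystem instead.
mysql --local-infile=1 -uroot -ptest test << EOF
LOAD DATA LOCAL INFILE 'test.txt'
INTO TABLE test
FIELDS TERMINATED BY ' ';
EOF
```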
You are trying to insert the literal string 'cat test.txt' with an INSERT statement that expects three values (IP, MAC and SERVER), which is why you get this error.
You need to read the text file first, extract the IP, MAC and server values, and then use them in the query, which would look like this once filled in:
#!/bin/bash
echo "INSERT INTO test (IP,MAC,SERVER) VALUES ('10.16.54.29', '00:f8:e5:33:22:3f', 'marsara');" | mysql -uroot -ptest test;
I use this and it works:
mysql -uroot -proot < infile
or select the database first
./mysql -uroot -proot db_name < infile
or copy the whole SQL into the clipboard and paste it with
pbpaste > temp_infile && mysql -uroot -proot < temp_infile && rm temp_infile
#!/bin/bash
username=root
password=root
dbname=myDB
host=localhost
TS=$(date +%s)
echo $1
mysql -h$host -D$dbname -u$username -p$password -e"INSERT INTO dailyTemp (UTS, tempF) VALUES ($TS, $1);"
exit 0

Export as csv in beeline hive

I am trying to export my Hive table as a CSV in Beeline. When I run the command !sql select * from database1 > /user/bob/output.csv, it gives me a syntax error.
I have successfully connected to the database using the command below; the query prints the correct results to the console.
beeline -u 'jdbc:hive2://[databaseaddress]' --outputformat=csv
Also, it is not clear where the file ends up. It should be a path in HDFS, correct?
When the Hive version is at least 0.11.0, you can execute:
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/directoryWhereToStoreData'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY "\n"
SELECT * FROM yourTable;
from hive/beeline to store the table into a directory on the local filesystem.
Alternatively, with beeline, save your SELECT query in yourSQLFile.sql and run:
beeline -u 'jdbc:hive2://[databaseaddress]' --outputformat=csv2 -f yourSQlFile.sql > theFileWhereToStoreTheData.csv
This also stores the result in a file on the local filesystem.
From hive, to store the data somewhere into HDFS:
CREATE EXTERNAL TABLE output
LIKE yourTable
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
LOCATION 'hdfs://WhereDoYou/Like';
INSERT OVERWRITE TABLE output SELECT * from yourTable;
then you can collect the data into a local file with (getmerge needs a local destination path):
hdfs dfs -getmerge /WhereDoYou/Like output.csv
This is another option to get the data using beeline only:
env HADOOP_CLIENT_OPTS="-Ddisable.quoting.for.sv=false" beeline -u "jdbc:hive2://your.hive.server.address:10000/" --incremental=true --outputformat=csv2 -e "select * from youdatabase.yourtable"
Tested on:
Connected to: Apache Hive (version 1.1.0-cdh5.10.1)
Driver: Hive JDBC (version 1.1.0-cdh5.10.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.1.0-cdh5.10.1 by Apache Hive
You can use this command to save output in CSV format from beeline:
beeline -u 'jdbc:hive2://bigdataplatform-dev.nam.nsroot.net:10000/;principal=hive/bigdataplatform-dev.net#NAMUXDEV.NET;ssl=true' --outputformat=csv2 --verbose=false --fastConnect=true --silent=true -f "$query_file" > out.csv
Save your SQL query file into $query_file.
Result will be in out.csv.
I have complete example here: hivehoney
The following worked for me:
hive --silent=true --verbose=false --outputformat=csv2 -e "use <db_name>; select * from <table_name>" > table_name.csv
One advantage over using beeline is that you don't have to provide a hostname or user/password if you are running it on the Hive node.
When some of the columns contain string values with commas, tsv (tab-separated) works better:
hive --silent=true --verbose=false --outputformat=tsv -e "use <db_name>; select * from <table_name>" > table_name.tsv
Output format in CSV:
$ beeline -u jdbc:hive2://192.168.0.41:10000/test_db -n user1 -p password --outputformat=csv2 -e "select * from t1"
Output format in custom delimiter:
$ beeline -u jdbc:hive2://192.168.0.41:10000/test_db -n user1 -p password --outputformat=dsv --delimiterForDSV='|' -e "select * from t1"
Running the command in the background and redirecting output to a file:
$ nohup beeline -u jdbc:hive2://192.168.0.41:10000/test_db -n user1 -p password --outputformat=csv2 -e "select * from t1" > output.csv 2> log &
Reference URLs:
https://dwgeek.com/export-hive-table-into-csv-format-using-beeline-client-example.html/
https://dwgeek.com/hiveserver2-beeline-command-line-shell-options-examples.html/
From Beeline
beeline -u 'jdbc:hive2://123.12.4132:345/database_name' --outputformat=csv2 -e "select col1, col2, col3 from table_name" > /path/to/dump.csv


How do I import data from a .bat file in MySQL?

I created a .bat file (import_file.bat):
set database_name=nome_db
mysql –u root --password=pass --database %database_name% < c:/import_geco/sql_svuta.sql
mysql –u root --password=pass --database %database_name% < c:/import_geco/carica_dati.sql
From the command line, in the MySQL bin directory I entered:
mysql/bin>c:/import_db/import_file.bat
... but it doesn't work; instead it returns the MySQL help info.
If I create a .bat file to export table (export.bat):
mysqldump --no-create-info -u root nome_db nome_tabella > c:/backup_db/export.sql
... and enter at the command line:
mysql/bin>c:/import_db/export.bat
it works.
You get the help output because the command is malformed: the import commands use an en-dash (–u) where mysql expects a plain hyphen (-u), so the option is not recognized (note that your working export.bat uses -u). Alternatively, you can pass the statements with the -e (--execute) option, or load the data with LOAD DATA INFILE.
see: http://dev.mysql.com/doc/refman/5.5/en/mysql-command-options.html
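For reference, a corrected import_file.bat with plain hyphens, assuming the dash characters were the only problem (paths and names are copied from the question):

```bat
set database_name=nome_db
mysql -u root --password=pass --database %database_name% < c:/import_geco/sql_svuta.sql
mysql -u root --password=pass --database %database_name% < c:/import_geco/carica_dati.sql
```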