In my system I have a MySQL database that I want to import into the Hadoop file system (HDFS). I found something about Sqoop, but I can't figure out the command to do it.
sqoop import --connect jdbc:mysql://mysql-server-name/db_name --username user --password password --table table_name --target-dir target_directory_name -m 1
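If the import succeeds, you can sanity-check the files Sqoop wrote to HDFS (a quick look, reusing the placeholder target directory above; the part file name assumes the default text output of a single mapper):
hdfs dfs -ls target_directory_name
hdfs dfs -cat target_directory_name/part-m-00000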
Hope it helps.
You need to install the MySQL JDBC/Java connector and then run the Sqoop command.
sudo yum install mysql-connector-java
ln -s /usr/share/java/mysql-connector-java.jar /var/lib/sqoop/mysql-connector-java.jar
You can find an example of the sqoop command in this related question:
Save data into mysql from hive hadoop through sqoop?
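Once the connector jar is linked, a quick way to confirm Sqoop can actually reach MySQL before running a full import (a sketch, reusing the placeholder connection details from the first answer):
sqoop list-databases --connect jdbc:mysql://mysql-server-name/ --username user --password password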
1) Install and configure MySQL first, and create a database in MySQL (a sample setup is sketched after the example below).
2) Run sqoop import --connect jdbc:mysql://localhost/databasename --username $USER_NAME --password $PASSWORD --table tablename -m 1 to import the data.
e.g.
sqoop import --connect jdbc:mysql://localhost/testDb --username root --password hadoop123 --table student -m 1
In the above command, the parameter values are: database 'testDb', username 'root', password 'hadoop123', and table 'student'.
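For step 1, a minimal setup matching the example values could look like this (only a sketch; the student table's columns are hypothetical and must match your actual data):
# Hypothetical database and table for the example import above
mysql -u root -p -e "CREATE DATABASE testDb;"
mysql -u root -p -e "CREATE TABLE testDb.student (id INT PRIMARY KEY, name VARCHAR(50));"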
Have a look at article 1 and article 2 for a step-by-step walkthrough.
I am trying to load data from MySQL into HDFS using the query below:
$ sqoop import --connect jdbc:mysql://127.0.0.1:3306/userdb \
--username root --password pwd \
--driver com.mysql.jdbc.Driver --table import-all-tables \
--m 1 --target-dir /user/cloudera/foldername
I get a "communications link failure error".
MYSQL Source: Installed in my local computer
HDFS Destination: Cloudera in virtual machine
Please help me how to load data from my local computer to cloudera HDFS .
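Note that from inside the Cloudera VM, 127.0.0.1 refers to the VM itself rather than the local computer where MySQL is installed, which is a common cause of this communications link failure; also, --table expects a table name (import-all-tables is a separate Sqoop tool). A sketch of an adjusted command, assuming a hypothetical host address 192.168.56.1 that the VM can reach and a MySQL server configured to accept remote connections:
# 192.168.56.1 and <table-name> are placeholders; use the host machine's real address and table
sqoop import --connect jdbc:mysql://192.168.56.1:3306/userdb \
  --username root --password pwd \
  --driver com.mysql.jdbc.Driver --table <table-name> \
  -m 1 --target-dir /user/cloudera/foldername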
I'm trying to load a MySQL table into HBase using Sqoop. I'm using the command below, but it shows an "unexpected tIdentifier" error. Please help.
sqoop import --connect jdbc:mysql://localhost/manideep --username root --password cloudera --table sample --hbase-table H_LOAN --column-family CD --hbase-row-key id -m 1
The following command has worked for me to import a table into HBase with Sqoop:
sqoop import --connect "jdbc:mysql://localhost/manideep" --username <username> --password ****** --table <mysql-table-name> --columns "column1,colmn2,column3" --hbase-table <hbase-table-name> --column-family <hbase-table-column-family> --hbase-row-key <row-key-column-name> -m 1
If the solution does not work, please post the exact issue (exception details) you have faced.
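If the import goes through, you can spot-check a few rows from the HBase shell (a quick check, using the H_LOAN table name from the question):
echo "scan 'H_LOAN', {LIMIT => 5}" | hbase shell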
I'm using DSE 5.0.3, which has Sqoop version 1.4.5.15.1. While importing my data from MySQL to CQL, I get this error:
./dse sqoop cql-import --table npa_nxx --connect jdbc:mysql://localhost/npa_nxx_demo --username root --password 123
ERROR 13:20:53,886 Imported Failed: Parameter 'directory' is not a directory.
Please help me solve it.
It's difficult to be exact about the problem, but I'd suggest that because you aren't including any Cassandra parameters, the cql-import command is attempting to import into an HDFS directory, which also isn't declared. Try including a Cassandra keyspace and table on the command, like:
./dse sqoop cql-import --table npa_nxx --connect jdbc:mysql://localhost/npa_nxx_demo --username root --password 123 --cassandra-keyspace npa_nxx --cassandra-table npa_nxx_data
This assumes that the Cassandra keyspace and table are correctly set up.
Since this import looks like it is from the DSE Sqoop demo, I'd suggest following the README.txt more closely, as it has the correct options for this import.
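For reference, a minimal keyspace and table setup could look like the following. This is only a sketch with assumed column names, types, and primary key (based on the NPA/NXX demo data); the schema described in the demo's README should be preferred:
# Hypothetical schema; the demo's own setup is authoritative
cqlsh -e "CREATE KEYSPACE IF NOT EXISTS npa_nxx WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};"
cqlsh -e "CREATE TABLE IF NOT EXISTS npa_nxx.npa_nxx_data (npa int, nxx int, latitude double, longitude double, state text, city text, PRIMARY KEY (npa, nxx));"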
I am running the Sqoop demo from DataStax Enterprise 5.0 and I have followed every instruction, and everything seems to work fine. But when I run the import from SQL to CQL, I get a NoHostAvailableException. Please, I need help. Thanks.
These are my import options:
cql-import
--table
npa_nxx
--cassandra-keyspace
npa_nxx
--cassandra-table
npa_nxx_data
--cassandra-column-mapping
npa:npa,nxx:nxx,latitude:lat,longitude:lon,state:state,city:city
--connect
jdbc:mysql://127.0.0.1:3306/npa_nxx_demo
--username
root
--password
xxxx
--cassandra-host
127.0.0.1
After I run this I get errors (a screenshot of the exception was attached, not reproduced here).
Instead of 127.0.0.1, use the hostname; you can get the hostname with this command: hostname -f
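For example, if hostname -f prints node1.example.com (a hypothetical name), the option block would become:
--cassandra-host
node1.example.com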
I have my data stored in a Hive table.
I want to transfer selected data from Hive tables to a MySQL table using Sqoop.
Please guide me on how to do this.
Check out the Sqoop user guide here.
You need to use sqoop export; here is an example:
sqoop export --connect "jdbc:mysql://quickstart.cloudera:3306/retail_rpt_db" \
--username retail_dba \
--password cloudera \
--table departments \
--export-dir /user/hive/warehouse/retail_ods.db/departments \
--input-fields-terminated-by '|' \
--input-lines-terminated-by '\n' \
--num-mappers 2
Use sqoop export to export data from Hadoop to MySQL. The options used above are:
--connect: the JDBC URL
--username: the MySQL username
--password: the password for the MySQL user
--table: the MySQL table name
--export-dir: a valid Hadoop directory holding the data to export
--input-fields-terminated-by: the column delimiter of the data in Hadoop
--input-lines-terminated-by: the row delimiter of the data in Hadoop
--num-mappers: the number of mappers used to process the data
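One more note: sqoop export writes into a table that must already exist in MySQL. A sketch of creating it, assuming hypothetical column names and types that match the retail demo's departments data:
# Hypothetical schema; adjust the columns to match the exported data exactly
mysql -u retail_dba -p -e "CREATE TABLE IF NOT EXISTS retail_rpt_db.departments (department_id INT PRIMARY KEY, department_name VARCHAR(45));"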