Sqoop Syntax error, unexpected tIdentifier - mysql

I'm trying to load a MySQL table into HBase using Sqoop. I'm using the command below, but it shows an "unexpected tIdentifier" error. Please help.
sqoop import --connect jdbc:mysql://localhost/manideep --username root --password cloudera --table sample --hbase-table H_LOAN --column-family CD --hbase-row-key id -m 1

The following command has worked for me to import into an HBase table with Sqoop:
sqoop import --connect "jdbc:mysql://localhost/manideep" --username <username> --password ****** --table <mysql-table-name> --columns "column1,column2,column3" --hbase-table <hbase-table-name> --column-family <hbase-table-column-family> --hbase-row-key <row-key-column-name> -m 1
If the solution does not work, please post the exact issue (exception details) you have faced.
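For reference, here is the template filled in with the values from the question; the --columns list is an assumption, so substitute your table's actual column names:
# Hypothetical filled-in example; "id,name,amount" is an assumed column list
sqoop import \
--connect "jdbc:mysql://localhost/manideep" \
--username root \
--password cloudera \
--table sample \
--columns "id,name,amount" \
--hbase-table H_LOAN \
--column-family CD \
--hbase-row-key id \
-m 1
If the H_LOAN table does not already exist in HBase, adding --hbase-create-table lets Sqoop create it.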

Related

How to use 'create-hive-table' with Sqoop correctly?

I'm trying to import data from a MySQL table into Hive using Sqoop. From what I understand, there are two ways of doing that:
1. Import the data into HDFS, then create an external table in Hive and load the data into that table.
2. Use --create-hive-table while running the Sqoop import to create a new table in Hive and load the data into it directly. I'm trying to do this but can't get it to work for some reason.
This is my code:
sqoop import \
--connect jdbc:mysql://localhost/EMPLOYEE \
--username root \
--password root \
--table emp \
--m 1 \
--hive-database sqoopimport \
--hive-table sqoopimport.employee \
--create-hive-table \
--fields-terminated-by ',';
I tried using --hive-import as well but got an error.
When I ran the above query, the job succeeded, but no table was created in Hive; the data ended up in /user/HDFS/emp/, a directory that was created during the job.
PS: I also could not find any reason for using --m 1 with Sqoop; it just appears in every example.
I got the import working with the following query. There is no need to pass --create-hive-table; you can just give a new table name with --hive-table and that table will be created. Also, if you hit an issue, go to the hive-metastore location, run rm *.lck, and then try the import again. As for --m 1 (short for --num-mappers 1): it tells Sqoop to use a single map task, which is required when the source table has no primary key and no --split-by column is supplied.
sqoop import \
--connect jdbc:mysql://localhost/EMPLOYEE \
--username root \
--password root \
--table emp4 \
--hive-import \
--hive-table sqoopimport.emp4 \
--fields-terminated-by "," ;
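For completeness, the first approach from the question (import into HDFS, then create an external table over it) would look roughly like the sketch below; the target directory and the emp column schema are assumptions:
# 1. Import into a plain HDFS directory (path is assumed)
sqoop import \
--connect jdbc:mysql://localhost/EMPLOYEE \
--username root \
--password root \
--table emp \
--target-dir /user/hive/external/emp \
--fields-terminated-by ',' \
-m 1
# 2. Point an external Hive table at that directory (schema is assumed)
hive -e "CREATE EXTERNAL TABLE sqoopimport.emp_ext (id INT, name STRING, salary DOUBLE)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/hive/external/emp';"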

Sqoop command for importing a table

I'm using DSE 5.0.3, which has Sqoop version 1.4.5.15.1. While importing my data from MySQL to CQL, I get this error:
./dse sqoop cql-import --table npa_nxx --connect jdbc:mysql://localhost/npa_nxx_demo --username root --password 123
ERROR 13:20:53,886 Imported Failed: Parameter 'directory' is not a directory.
Please help me solve it!
It's difficult to be exact about the problem, but I'd suggest that because you aren't including any Cassandra parameters, the cql-import command is attempting to import into an HDFS directory that also isn't declared. Try including --cassandra-keyspace and --cassandra-table on the command, like:
./dse sqoop cql-import --table npa_nxx --connect jdbc:mysql://localhost/npa_nxx_demo --username root --password 123 --cassandra-keyspace npa_nxx --cassandra-table npa_nxx_data
This assumes that the Cassandra keyspace and table are correctly set up.
Since this import looks like it is from the DSE Sqoop demo, I'd suggest following its README.txt more closely, as it has the correct options for this import.
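If the keyspace and table still need to be created, a minimal cqlsh setup might look like the following; the column schema here is an assumption based on the demo's table name:
# Hypothetical keyspace/table setup; adjust the schema to match your data
cqlsh -e "CREATE KEYSPACE IF NOT EXISTS npa_nxx
WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};
CREATE TABLE IF NOT EXISTS npa_nxx.npa_nxx_data (
npa int, nxx int, latitude float, longitude float,
state text, city text,
PRIMARY KEY (npa, nxx));"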

How to import MySQL data to Hadoop file system?

I have a database in MySQL on my system and I want to import it into the Hadoop file system. I found something about Sqoop, but I can't figure out the command to do that.
sqoop import --connect jdbc:mysql://mysql-server-name/db_name --username user --password password --table table_name --target-dir target_directory_name -m 1
Hope it helps.
You need to install the MySQL JDBC/Java connector and then run the sqoop command:
sudo yum install mysql-connector-java
ln -s /usr/share/java/mysql-connector-java.jar /var/lib/sqoop/mysql-connector-java.jar
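With the connector linked, you can run the import and then spot-check the result in HDFS; the target directory below is an assumption:
# Run the import (writes part files under --target-dir)
sqoop import \
--connect jdbc:mysql://mysql-server-name/db_name \
--username user \
--password password \
--table table_name \
--target-dir /user/cloudera/table_name \
-m 1
# Verify: list the output and sample the first rows
hdfs dfs -ls /user/cloudera/table_name
hdfs dfs -cat /user/cloudera/table_name/part-m-00000 | head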
1) Install and configure MySQL first, and create a database in MySQL.
2) The command sqoop import --connect jdbc:mysql://localhost/databasename --username $USER_NAME --password $PASSWORD --table tablename --m 1 will import the data.
e.g.
sqoop import --connect jdbc:mysql://localhost/testDb --username root --password hadoop123 --table student --m 1
In the above command, the parameter values are database: 'testDb', username: 'root', password: 'hadoop123', and table: 'student'.
Have a look at article 1 and article 2 for a step-by-step walkthrough.
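For step 1, a minimal MySQL setup matching the example values could look like this; the student table's schema is an assumption:
# Create the example database and a sample table (schema is hypothetical)
mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS testDb;
CREATE TABLE IF NOT EXISTS testDb.student (
id INT PRIMARY KEY, name VARCHAR(50), grade INT);"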

Sqoop export fails with MySQL

While executing Sqoop export jobs to MySQL, I'm facing the following exceptions:
No columns to generate for ClassWriter
Cannot load connection class because of underlying exception: 'java.lang.NumberFormatException: For input string: "3306;"'.
Please help to resolve this issue.
The NumberFormatException on the input string "3306;" suggests a stray semicolon after the port number in your JDBC connect string; remove it so the URL reads jdbc:mysql://<host>:3306/<database>. In case you want to export data from HDFS to your MySQL table, you can use the syntax below:
sqoop export \
--connect jdbc:mysql://<HOST_NAME (or) IP_ADDRESS>:<MySQL_PORT_NO>/<DATABASE_NAME> \
--username <DB_USER_NAME> \
--password <DB_PASSWORD> \
--table <YOUR_TABLE_NAME> \
--export-dir <HDFS_PATH_FROM_WHERE_YOU_WANT_TO_EXPORT_DATA>
For example, to export the data under /mysql/user/ in HDFS to the MySQL table user, the command would be:
sqoop export \
--connect jdbc:mysql://localhost:3306/test \
--username hadoopuser \
--password **** \
--table user \
--export-dir /mysql/user/
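sqoop export expects the files under --export-dir to use the delimiters it is told about (comma-separated fields by default), so it can help to inspect the data first; the path follows the example above:
# Confirm the field delimiter before exporting
hdfs dfs -cat /mysql/user/part-m-00000 | head -5
# If the fields are tab-separated instead of comma-separated, add:
#   --input-fields-terminated-by '\t'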

Save data into mysql from hive hadoop through sqoop?

I have my data stored in a Hive table, and I want to transfer selected data from the Hive table to a MySQL table using Sqoop.
Please guide me on how to do this.
Check out the Sqoop user guide.
You need to use sqoop export; here is an example:
sqoop export --connect "jdbc:mysql://quickstart.cloudera:3306/retail_rpt_db" \
--username retail_dba \
--password cloudera \
--table departments \
--export-dir /user/hive/warehouse/retail_ods.db/departments \
--input-fields-terminated-by '|' \
--input-lines-terminated-by '\n' \
--num-mappers 2
sqoop export exports data from Hadoop to MySQL. The options used above:
--connect: the JDBC URL
--username: the MySQL username
--password: the password for the MySQL user
--table: the MySQL table name
--export-dir: a valid Hadoop directory
--input-fields-terminated-by: the column delimiter in Hadoop
--input-lines-terminated-by: the row delimiter in Hadoop
--num-mappers: the number of mappers used to process the data
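Since the question asks for selected data rather than a whole table, one common pattern is to stage the query result into a scratch directory with Hive and then export that directory; in the sketch below, the staging path, the SELECT query, and the column list are assumptions:
# 1. Stage only the rows/columns you want into a scratch directory
hive -e "INSERT OVERWRITE DIRECTORY '/tmp/departments_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
SELECT department_id, department_name FROM retail_ods.departments
WHERE department_id > 5;"
# 2. Export the staged directory to MySQL
sqoop export \
--connect "jdbc:mysql://quickstart.cloudera:3306/retail_rpt_db" \
--username retail_dba \
--password cloudera \
--table departments \
--export-dir /tmp/departments_export \
--input-fields-terminated-by '|' \
--num-mappers 2
Note that the target MySQL table's columns must line up with the staged fields.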