Importing CSV to database: delimiter issues - MySQL

The 3rd column in some rows holds semicolons: test 1; test 2; test 3
INSERT INTO `test_table` VALUES ('21', 'test data', 'test 1; test 2; test 3', '', 'Other', 'test data', 'Free ', '')
What should the delimiter options be set to in phpMyAdmin to import in this situation?
My test.csv file is saved in comma (,) separated format.
I tried with columns separated with: \t
That throws the error: Invalid column count in CSV input on line 1.

Ignoring insert record errors solved my problem without any delimiter changes.
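As an aside, embedded semicolons only matter if ';' is chosen as the field separator; a standard comma-delimited parser treats them as ordinary characters. A minimal Python sketch (row values taken from the question) illustrating that the column count is unaffected:

```python
import csv
import io

# Row whose 3rd field contains semicolons (values from the question).
row = ["21", "test data", "test 1; test 2; test 3", "", "Other",
       "test data", "Free ", ""]

# Write it as a comma-separated line, then parse it back.
buf = io.StringIO()
csv.writer(buf).writerow(row)

parsed = next(csv.reader(io.StringIO(buf.getvalue())))
# The semicolons survive untouched and the column count is preserved.
assert parsed == row
```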

Related

How to punctuate an INSERT statement that uses FROM_UNIXTIME

In a Python environment, I've got the following variable:
post_time_ms="1581546697000"
which is a Unix-style time with milliseconds.
In my table, "created_date_time" is defined as a datetime column.
I'm trying to use an INSERT statement of the form:
sql_insert_query = "INSERT INTO myTable (id_string, text,
created_date_time) VALUES ('identifier', 'text_content',
FROM_UNIXTIME('post_time/1000')"
I can't figure out how I'm supposed to punctuate that. If I run the query as shown above, I get:
Failed to insert record 1292 (22007): Truncated incorrect DECIMAL value: 'post_time/1000'
I've tried every variation of single quotes/no quotes I can think of but I always get errors.
For example if I do:
sql_insert_query = "INSERT INTO myTable (id_string, text,
created_date_time) VALUES ('identifier', 'text_content',
FROM_UNIXTIME('post_time'/1000)"
I get:
Failed to insert record 1292 (22007): Truncated incorrect DOUBLE value: 'post_time'
I've gone so far as to try and convert the Unix-style "1581546697000" value as follows:
post_time_mysql = datetime.fromtimestamp(int(post_time)/1000)
and then:
sql_insert_query = "INSERT INTO myTable (id_string, text,
created_date_time) VALUES ('identifier', 'text_content',
'post_time_mysql')"
and even though
print(post_time_mysql)
outputs "2020-02-14 09:25:28",
I still get this error for the above query:
Failed to insert record 1292 (22007): Incorrect datetime value: 'post_time_mysql' for column `myDatabase`.`myTable`.`created_date_time` at row 1
Any ideas/suggestions?
==========
My workaround is to do the following:
sql_insert_query = "INSERT INTO myTable (id_string, text, created_date_time) VALUES (%s, %s, %s)"
sql_insert_data = (identifier, text_content, post_time_mysql)
cursor.execute(sql_insert_query, sql_insert_data)
But I still don't understand how to successfully do it with one query statement - if it's possible.
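For reference, both failures trace back to quoting: 'post_time/1000' and 'post_time' are string literals, not the variable's value, so MySQL tries to coerce the literal text to a number. A sketch of the two working options, assuming a DB-API driver such as mysql-connector-python or PyMySQL (which use %s placeholders):

```python
from datetime import datetime, timezone

post_time_ms = "1581546697000"  # Unix epoch in milliseconds, from the question

# Option 1: convert in Python and bind the resulting datetime directly.
created = datetime.fromtimestamp(int(post_time_ms) / 1000, tz=timezone.utc)

# Option 2: let MySQL convert. FROM_UNIXTIME expects seconds, so divide
# the bound numeric parameter server-side. Note the placeholder is NOT quoted.
sql = ("INSERT INTO myTable (id_string, text, created_date_time) "
       "VALUES (%s, %s, FROM_UNIXTIME(%s / 1000))")
params = ("identifier", "text_content", int(post_time_ms))
# cursor.execute(sql, params)  # with a live connection

print(created.strftime("%Y-%m-%d %H:%M:%S"))
```

Either way, the key is that values reach the query through parameters, never through quoted names inside the SQL string.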

Update data only if the letter is not at the front

My Data:
TAn
Ants
TAr
Arm
TogA
UPDATE sample SET sample_data = REPLACE(sample_data , 'A', 'a');
The above shows my data and the SQL code I am using to change A to a. However, I only want to change A to a if A is not the first letter. How can I accomplish this in MySQL?
Only call REPLACE on SUBSTRING(sample_data, 2)
UPDATE sample
SET sample_data = CONCAT(LEFT(sample_data, 1), REPLACE(SUBSTRING(sample_data, 2), 'A', 'a'))
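To see what that expression does, here is a small Python sketch replicating CONCAT(LEFT(s, 1), REPLACE(SUBSTRING(s, 2), 'A', 'a')) on the sample data (the function name is made up for illustration):

```python
def lower_a_after_first(s: str) -> str:
    # LEFT(s, 1) keeps the first character untouched;
    # REPLACE(SUBSTRING(s, 2), 'A', 'a') rewrites the rest.
    return s[:1] + s[1:].replace("A", "a")

for word in ["TAn", "Ants", "TAr", "Arm", "TogA"]:
    print(word, "->", lower_a_after_first(word))
# "Ants" and "Arm" keep their leading A; every later A becomes a.
```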

Need help composing an update query replacing many string values in MySQL

I have a field in a DB table with string values such as "principaux de l\'asthme : facteurs psychologiques, &eacute-; blabla &eacute tiques &agrave-; blabla".
I want to replace the string "&eacute-;" with "é" and "&agrave-;" with "à".
Can I have a query for it?
Check with this:
REPLACE(Column_name, 'K', 'SA')
Column_name: the name of your column
'K': the characters you want to replace
'SA': the characters to replace them with
If you have to repeat this for multiple strings, use cascaded REPLACE calls, such as:
REPLACE(REPLACE(REPLACE(Column_Name, '3', 'test'), '2', 'test'), '1', 'test')
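Nesting gets unwieldy with many pairs; a Python sketch of the equivalent chained replacement (the helper name and mapping are illustrative, built from the two entities in the question):

```python
replacements = {
    "&eacute-;": "é",
    "&agrave-;": "à",
}

def fix_entities(text: str) -> str:
    # Apply each pair in turn -- the same effect as nested SQL REPLACE calls.
    for old, new in replacements.items():
        text = text.replace(old, new)
    return text

s = "principaux de l'asthme : facteurs psychologiques, &eacute-; blabla &agrave-; blabla"
print(fix_entities(s))
```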

Generate ID while importing a .sql file to the database

I have a .sql file that I am importing into my database using phpMyAdmin. Each time, I have been going through the long list of values and changing the IDs so they don't conflict with other entries. Since I don't care what the ID is, is there any way to have it auto-generated?
Example:
INSERT INTO `my_column` (`id`, `valueone`, `valuetwo`) VALUES
(1, 'some value A', 'some value B'),(2, 'some value C', 'some value D'),(3, 'some value E', 'some value F');
So in the above code, I don't want to type in the "1", "2", and "3".
Can I just leave this blank for it to auto generate? Or is there a symbol that I add instead?
Thanks!
Add AUTO_INCREMENT to the id column and remove it from the INSERT statement:
INSERT INTO `my_column` (`valueone`, `valuetwo`) VALUES
('some value A', 'some value B'), ('some value C', 'some value D'), (...
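To confirm the behavior, a rough sketch using SQLite's AUTOINCREMENT (from Python's standard library) as a stand-in for MySQL's AUTO_INCREMENT; the table and column names follow the question, and the principle is identical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# SQLite's INTEGER PRIMARY KEY AUTOINCREMENT plays the role of
# MySQL's AUTO_INCREMENT here.
con.execute("CREATE TABLE my_column ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "valueone TEXT, valuetwo TEXT)")

# Omit the id column entirely; each row gets the next free id.
con.executemany("INSERT INTO my_column (valueone, valuetwo) VALUES (?, ?)",
                [("some value A", "some value B"),
                 ("some value C", "some value D"),
                 ("some value E", "some value F")])

print(con.execute("SELECT id FROM my_column").fetchall())
# ids 1, 2, 3 were generated automatically
```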

Drupal 6 db insert: Strings are getting escaped

Getting driven crazy by this one...
I'm trying to insert a number of rows into a D6 database with a single db_query call. I've collected the rows as a set of strings, and have then collected them into one big string, something like so:
$theData = "(1, 2, 'a'), (3, 4, 'b'), (5, 6, 'c')";
db_query("insert into {table} (int1, int2, str) values %s", $theData);
($theData isn't typed like that in my code; it's the result of the code I've written -- a big string containing sets of values wrapped up in parens.)
When this runs, I get an error like:
You have an error in your SQL syntax; check the manual that corresponds
to your MySQL server version for the right syntax to use near 'insert into
table (int1, int2, str) values (1,2,\'a\' at line 1 query: insert into
table (int1, int2, str) values (1,2,\'a\'),(3,4,\'n\'),(5,6,\'c\')...
So db_query, or somebody else, is escaping the strings before passing the values off to MySQL. How do I keep this from happening? I could do individual queries for each set of data, but that's wrong/expensive for all the obvious reasons. Thanks!
$theDatas = array("(1, 2, 'a')", "(3, 4, 'b')", "(5, 6, 'c')");
foreach($theDatas as $data) {
db_query("insert into {table} (int1, int2, str) values %s", $data);
}
But it's not recommended to do that; instead you should:
$theDatas = array(array(1, 2, 'a'), array(3, 4, 'b'), array(5, 6, 'c'));
foreach($theDatas as $data) {
db_query("insert into {table} (int1, int2, str) values (%d, %d, '%s')", $data[0], $data[1], $data[2]);
}
Or you can serialize($theData) and put it into a "text" format field as one value, then use unserialize() to restore the array - this approach is recommended only if you just want to store the data (no searching, indexing, etc.).
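The general pattern holds in any DB layer, not just Drupal: give the driver one placeholder per column and one tuple per row, instead of a pre-built value string. A minimal Python sketch with the standard-library sqlite3 module (table and column names follow the question):

```python
import sqlite3

rows = [(1, 2, "a"), (3, 4, "b"), (5, 6, "c")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (int1 INTEGER, int2 INTEGER, str TEXT)")
# The driver quotes each value itself, so nothing gets double-escaped.
con.executemany("INSERT INTO t (int1, int2, str) VALUES (?, ?, ?)", rows)

print(con.execute("SELECT * FROM t").fetchall())
```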