MySQL - storing JSON, double quote escape issue

I've been chasing a JSON bug around and discovered that I'm getting different results when I load a file from disk compared to loading the (apparently) same file from the database.
MySQL seems to be stealing my escape characters. (I'm using VBScript; my connection string is Driver={MySQL ODBC 5.1 Driver};Server=localhost;Database=foo;User=foo;Password=f00;Option=3;)
On doing a conn.execute(...) of:
update courses set config = '{"set": "value in \"here\" ok?"}' where id = 21;
select config from courses where id = 21;
-- prints the changed value: {"set": "value in "here" ok?"}
What's going on here? Why is MySQL taking out my \" and turning them into "?
If I use Workbench on the server (Windows 2003) and use the "load value from file" feature in the results pane, I can import the JSON into the field and it retains the proper escape-sequence values. But when doing an update/insert, the escape-sequence characters are lost.

MySQL does escaping as well. If you need the escape characters back when retrieving data, you'll need to escape the escape characters, e.g. "\\".

The issue is solved when you use \\".
But why do you have to add an extra \?
MySQL's string parser treats \" as an escape sequence and replaces it with ", so your final value becomes '{"set": "value in "here" ok?"}'.
By adding an extra \ you can build '{"set": "value in \"here\" ok?"}' (MySQL replaces \\ with \).
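A minimal sketch of the corrected statement against the same table and row as above, assuming the default sql_mode (backslash escapes enabled):
-- double each backslash: the parser turns \\ into \ and keeps the
-- following " literally, so the stored JSON keeps its \"
update courses set config = '{"set": "value in \\"here\\" ok?"}' where id = 21;
select config from courses where id = 21;
-- prints: {"set": "value in \"here\" ok?"}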

Related

DB2 Export - how is chardel escaped if it is contained in text?

The title should be self-explanatory, but here is an example:
How will this text column be exported by DB2 Export (with default chardel and coldel)?
The name is "John".
The question DB2 Load from delimited Files - escape " in Fields doesn't work mentions the double quote as the escape character.
The documentation says nothing about escaping:
https://www.ibm.com/support/knowledgecenter/SSEPGG_9.8.0/com.ibm.db2.luw.admin.cmd.doc/doc/r0008303.html
I also have no access to DB2 to test it myself ...
How are chardel (or coldel) escaped by DB2 export when they occur in text?
The delimiter is doubled:
https://www.ibm.com/support/knowledgecenter/SSEPGG_9.8.0/com.ibm.db2.luw.admin.dm.doc/doc/r0011047.html
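For example, here is a sketch based on the linked documentation (the table, column, and file names are made up); a chardel occurring inside a value is simply written doubled:
-- DB2 command line; default coldel (,) and chardel (")
EXPORT TO out.del OF DEL SELECT note FROM t
-- a stored value of:  The name is "John".
-- is exported as:     "The name is ""John""."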
I should have fully RTFM'd first...

How to have a column value equal to the enclosing character in MySQL LOAD DATA INFILE

I'm using mysqlimport, which uses the LOAD DATA INFILE command. My question is the following: assume I have --fields-enclosed-by='"', and a column whose values contain a double-quoted string, such as "5" object" (which stands for 5 inches). The problem is that when MySQL encounters the double quote after the 5, it treats it as the enclosing character, and things get messed up. How do I use mysqlimport with such values? I don't want to just use another enclosing character, because that other character may also occur in the data. So what is a general solution for this?
I guess it will be different to import the CSV this way.
To solve the above issue in another way:
Export (or convert) the old data into SQL format rather than CSV format.
Import that SQL data using the mysql command-line tool:
mysql -hservername -uusername -p'password' dbname < '/path/to/your/imported/file.sql'
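For illustration, the rows in such a .sql file would look like this (table and column names here are hypothetical); the embedded double quote needs no escaping inside a single-quoted SQL string literal:
insert into products (id, description) values (1, '5" object');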

Should escaped strings be "unescaped" (MySQL, Node.js)?

I am working with the node driver for MySQL by felixge. Following the documentation
https://github.com/felixge/node-mysql#escaping-query-values
strings should be escaped before being passed to MySQL to avoid injection attacks.
My question is: should data be "unescaped" in some way after being loaded from MySQL?
Currently, I have a problem with data integrity: I start with a string containing newlines (printing with console.log(string) shows the newlines in the console). After escaping, the string is saved into a MySQL database. However, after the string is loaded back into memory, console.log(string) shows the escape code \n instead of newlines.
strings should be escaped before being passed to MySQL to avoid injection attacks.
This statement is wrong.
First, strings should be escaped because of syntax rules, not because of injections.
Second, I hope they have some recipe for non-strings too.
Should escaped strings be “unescaped” (MySQL)
No.
Escaping is for the query, not for the database.
shows escape codes \n instead of newlines.
Then you are escaping your strings twice.
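A minimal SQL illustration of the difference, assuming a scratch table t with a TEXT column s (the names are made up for the example):
-- escaped once: MySQL's parser turns \n into a real newline character
insert into t (s) values ('line1\nline2');
-- escaped twice: \\n is stored as the two characters \ and n, which is
-- what you get when an already-escaped string is escaped again
insert into t (s) values ('line1\\nline2');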

Loading data into MySQL: How to deal with backslashes?

I downloaded a tab-delimited file from a well-known source and now want to upload it into a MySQL table. I am doing this using load data local infile.
This data file, which has over 10 million records, also has the misfortune of many backslashes.
$ grep '\\' tabd_file.txt | wc -l
223212
These backslashes aren't a problem, except when they come at the end of fields. MySQL interprets backslashes as an escape character, and when it comes at the end of the field, it messes up the next field, or possibly the next row.
In spite of these backslashes, I only received 6 warnings from MySQL when loading it into a table. In each of these warnings, a row doesn't have the proper number of columns precisely because the backslash concatenated two adjacent fields in the same row.
My question is, how to deal with these backslashes? Should I specify load data local infile [...] escaped by '' to remove any special meaning from them? Or would this have unintended consequences? I can't think of a single important use of an escape sequence in this data file. The actual tabs that terminate fields are "physical tabs", not "\t" sequences.
Or, is removing the escape character from my load command bad practice? Should I just replace every instance of '\' in the file with '\\'?
Thanks for any advice :-)
If you don't need the escaping, then definitely use ESCAPED BY ''.
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
"If the FIELDS ESCAPED BY character is empty, escape-sequence interpretation does not occur. "

OpenOffice - CSV export: is there a default escape character?

As far as I can see, when OpenOffice saves a file as CSV, it encloses all strings in quote characters.
So is there any need for an escape character?
and related to this question:
Does OpenOffice have a default escape character?
I'm also wondering if there is a way to choose the escape character when saving an OpenOffice file as CSV. phpMyAdmin was not accepting a 9,000-line, 50+ column spreadsheet in .ods format, and there doesn't seem to be a way to choose the escape character when saving as CSV.
So I had to save as CSV, open it in Word, and use some find/replace tricks to change the escape character to \ (backslash). The default is to use double quotes to escape double quotes, and phpMyAdmin won't accept that format.
To properly convert the file to use \ (backslash) to escape double quotes, you have to do this:
1. Pick a placeholder character string, e.g. 'abcdefg', that does not occur anywhere in the CSV.
2. Find/replace """ (three double quotes in a row) with the placeholder. This is to prevent possibly incorrect results in the next step.
3. Find/replace "" (two quotes in a row, representing one quote that should be escaped) with \" (backslash double-quote). If you did this without first find/replacing """, it's conceivable you could get a result like "\" instead of \"". Better safe than sorry.
4. Find/replace the placeholder string with \"" (backslash double-quote double-quote).
That will work, unless you happen to have more than one double quote in a row in your original text fields, which would result in as many as five double quotes in a row in the CSV exported from the .ods or .xlsx file (two double quotes for each escaped double quote, plus another double quote if it's at the end of the field).
Escaping with quotes makes life easier for tools parsing the CSV file.
In a recent version of LibreOffice (3.4.4), the CSV export was not handled correctly by phpMyAdmin. Since LibreOffice doesn't provide an escape character, phpMyAdmin's default CSV import option "Columns escaped with:" didn't work well; the data was always inconsistent.
However, the option "CSV using LOAD DATA" did work, but only if the value in the "Columns escaped by" option was removed. I presume phpMyAdmin uses the standard MySQL LOAD DATA command, and thus control is passed to MySQL for data processing. In my scenario this resulted in an accurate data import.
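For reference, this is roughly the LOAD DATA statement that setup corresponds to (the table name and file path are made up); with ESCAPED BY '' no escape-sequence interpretation happens, and a doubled ENCLOSED BY character inside a field is read back as a single literal quote:
LOAD DATA INFILE '/path/to/export.csv'
INTO TABLE t
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY ''
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row, if the export has one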