How do I handle unescaped MySQL data in a SELECT query?

Background:
A user submits a value for the field firstName.
I escape the user's data using mysql_real_escape_string and store it in a MySQL MEMORY table (table1).
A PHP cron job runs every 5 minutes, SELECTs this data into a PHP array (for replication reasons), and executes the following INSERT into an identical table in InnoDB format (table2):
INSERT INTO `table2` (`id`,`firstName`) VALUES ('1','aaa');
Problem:
If a user sends data containing a single quote, e.g. "John's", it is escaped while the INSERT runs, but it is not stored escaped: the value is saved with the literal single quote in firstName of table1.
When the INSERT above takes place, the unescaped data breaks the whole statement. How do I deal with this without manually escaping at every juncture?
I can't shift to PDO or mysqli at the moment.

You cannot avoid this without escaping the data every time you use it in a literal SQL query. Either use parameterized statements or escape your data.
What do you mean by "replication reasons"? Maybe you don't need to fetch all the data from your database just so you can push it back into another table. Inserting directly from one table into the other is much more efficient, as sketched below.
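For example, a minimal sketch of the server-side copy with the old mysql_* API (table and column names are taken from the question; the TRUNCATE is an assumption about how you clear the staging table):

<?php
// Copy rows straight from the MEMORY table into the InnoDB table.
// The data never round-trips through PHP, so quoting never comes up.
mysql_query("INSERT INTO `table2` (`id`, `firstName`)
             SELECT `id`, `firstName` FROM `table1`")
    or die(mysql_error());

// Assumption: the staging table is emptied once the rows are copied.
mysql_query("TRUNCATE TABLE `table1`") or die(mysql_error());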

Related

SQL query fails with DB fields strings containing apostrophe

I'm querying a MySQL DB where one field contains strings with apostrophes, which I cannot remove or escape when adding them to the DB. How do I format a query so it doesn't fail on a string containing an apostrophe? For example, doing a query against a FULLTEXT-indexed field:
"SELECT * FROM NationalTideStationsA WHERE MATCH(location) AGAINST('$myState')";
This fails whenever the returned string has an apostrophe, for example, when the location field contains:
"Cedar Tree Neck, Martha's Vineyard, Massachusetts"
all queries for locations in Massachusetts fail.
I cannot work out if SQL offers a way to format the query to cope with that.
The SELECT query works just as desired otherwise.
Agreed on the suggestion to read up on SQL injection. For an immediate fix, replace all single quotes with two single quotes.
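A minimal PHP sketch of that fix, assuming $myState is the only user-supplied piece (mysql_real_escape_string() is the more robust choice when a connection is open):

// Double any single quotes so "Martha's" becomes "Martha''s" inside the SQL literal.
$safeState = str_replace("'", "''", $myState);
$sql = "SELECT * FROM NationalTideStationsA WHERE MATCH(location) AGAINST('$safeState')";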

Firefox add-on: Populating an SQLite database with a lot of data (around 80,000 rows) not working with executeSimpleSQL()

The Firefox add-on that I am trying to code needs a big database.
I was advised not to load the database itself from the 'data' directory (I'm using the Add-on SDK to develop locally on my Linux box).
So I decided to read the content from a CSV file and insert it into the database that I created.
The thing is, the CSV has about 80,000 rows, and I get an error when I try to pass .executeSimpleSQL() the really long INSERT statement as a string:
('insert into table
values (row1val1,row1val2,row1val3),
(row2val1,row2val2,row2val3),
...
(row80000val1,row80000val2,row80000val3)')
Should I insert asynchronously? Use prepared statements?
Should I consider another approach, loading the database as an sqlite file directly?
You may be crossing some SQLite limits.
From SQLite's Implementation Limits:
Maximum Length Of An SQL Statement
The maximum number of bytes in the text of an SQL statement is limited
to SQLITE_MAX_SQL_LENGTH which defaults to 1000000. You can redefine
this limit to be as large as the smaller of SQLITE_MAX_LENGTH and
1073741824.
If an SQL statement is limited to be a million bytes in length, then
obviously you will not be able to insert multi-million byte strings by
embedding them as literals inside of INSERT statements. But you should
not do that anyway. Use host parameters for your data. Prepare short
SQL statements like this:
INSERT INTO tab1 VALUES(?,?,?);
Then use the sqlite3_bind_XXXX()
functions to bind your large string values to the SQL statement. The
use of binding obviates the need to escape quote characters in the
string, reducing the risk of SQL injection attacks. It also runs
faster since the large string does not need to be parsed or copied as
much.
The maximum length of an SQL statement can be lowered at run-time
using the sqlite3_limit(db,SQLITE_LIMIT_SQL_LENGTH,size) interface.
You cannot use that many records in a single INSERT statement;
SQLite limits the number to its internal parameter SQLITE_LIMIT_COMPOUND_SELECT, which is 500 by default.
Just use multiple INSERT statements.
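As an illustration of the bind-and-loop pattern the docs describe (shown with PHP's SQLite3 API rather than the Mozilla Storage API, purely because it is compact; $rows stands for your parsed CSV):

<?php
$db = new SQLite3('addon.db');
$db->exec('BEGIN');                 // one transaction keeps 80,000 inserts fast
$stmt = $db->prepare('INSERT INTO tab1 VALUES (?, ?, ?)');
foreach ($rows as $row) {
    $stmt->bindValue(1, $row[0]);
    $stmt->bindValue(2, $row[1]);
    $stmt->bindValue(3, $row[2]);
    $stmt->execute();               // one short statement, re-used every time
    $stmt->reset();
}
$db->exec('COMMIT');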

How to insert data into MySQL directly (not using SQL queries)

I have a MySQL database that I use only for logging. It consists of several simple, similar MyISAM tables. There is always one local client (i.e. located on the same machine) that only writes data to the DB, and several remote clients that only read data.
What I need is to insert bulks of data from local client as fast as possible.
I have already tried many approaches to make this faster, such as reducing the number of INSERTs by increasing the length of the VALUES list, or using LOAD DATA .. INFILE, and some others.
Now it seems to me that I've hit the limit of parsing values from strings into their target data types (it doesn't matter whether this happens while parsing queries or a text file).
So the question is:
Does MySQL provide some means of manipulating data directly for local clients (i.e. not using SQL)? Maybe there is some API that allows inserting data by simply passing a pointer.
Once again: I don't want to optimize the SQL code or invoke the same queries in a script as hd1 advised. What I want is to pass a buffer of data directly to the database engine. This means I don't want to invoke SQL at all. Is that possible?
Use MySQL's LOAD DATA command:
Write the data to a file in CSV format, then execute this SQL statement:
LOAD DATA INFILE 'somefile.csv' INTO TABLE mytable
For more info, see the documentation
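A hedged sketch of that round trip in PHP (file path and table name are placeholders; fputcsv handles the quoting for you):

<?php
$fh = fopen('/tmp/somefile.csv', 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);            // one CSV line per record
}
fclose($fh);

mysql_query("LOAD DATA LOCAL INFILE '/tmp/somefile.csv'
             INTO TABLE mytable
             FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'")
    or die(mysql_error());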
Other than LOAD DATA INFILE, I'm not sure there is any other way to get data into MySQL without using SQL. If you want to avoid parsing multiple times, you should use a client library that supports parameter binding; the query can then be parsed and prepared once and executed multiple times with different data.
However, I highly doubt that parsing the query is your bottleneck. Is this a dedicated database server? What kind of hard disks are being used? Are they fast? Does your RAID controller have battery backed RAM? If so, you can optimize disk writes. Why aren't you using InnoDB instead of MyISAM?
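A minimal sketch of the parse-once, execute-many pattern with mysqli (connection details, table name, and column types are assumptions):

<?php
$mysqli = new mysqli('localhost', 'me', 'foo', 'dbname');
$stmt = $mysqli->prepare('INSERT INTO mytable (col1, col2) VALUES (?, ?)');
$stmt->bind_param('is', $col1, $col2);   // bound by reference, parsed once
foreach ($rows as $row) {
    list($col1, $col2) = $row;
    $stmt->execute();                    // executed many times, no re-parse
}
$stmt->close();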
With MySQL you can insert multiple tuples with one insert statement. I don't have an example, because I did this several years ago and don't have the source anymore.
Consider as mentioned to use one INSERT with multiple values:
INSERT INTO table_name (col1, col2) VALUES (1, 'A'), (2, 'B'), (3, 'C'), ( ... )
This way you only have to connect to your database with one bigger query instead of several smaller ones. It's easier to carry the entire couch through the door once than to run back and forth with all the disassembled pieces, opening the door every time. :)
Apart from that, you can also run LOCK TABLES table_name WRITE before the INSERT and UNLOCK TABLES afterwards. That ensures nothing else is inserted in the meantime. See LOCK TABLES in the MySQL manual.
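Putting the two together, a sketch with the old mysql_* API (table and column names are made up; $rows is your batch):

<?php
mysql_query("LOCK TABLES table_name WRITE");
$values = array();
foreach ($rows as $row) {
    // Escape each value, then build one multi-row VALUES list.
    $values[] = "('" . mysql_real_escape_string($row['col1']) . "', '"
                     . mysql_real_escape_string($row['col2']) . "')";
}
mysql_query("INSERT INTO table_name (col1, col2) VALUES " . implode(', ', $values));
mysql_query("UNLOCK TABLES");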
INSERT INTO foo (foocol1, foocol2) VALUES ('foocol1val1', 'foocol2val1'), ('foocol1val2', 'foocol2val2') and so on should sort you out. More information and sample code can be found here. If you have further problems, do leave a comment.
UPDATE
If you don't want to use SQL, then try this shell script to do as many inserts as you want, put it in a file, say insertToDb.sh, and get on with your day/evening:
#!/bin/sh
mysql --user=me --password=foo dbname -h foo.example.com -e "insert into tablename (col1, col2) values ('$1', '$2');"
Invoke as sh insertToDb.sh col1value col2value. If I've still misunderstood your question, leave another comment.
After some investigation, I found no way of passing data directly to the MySQL database engine (without parsing it).
My aim was to speed up communication between the local client and the DB server as much as possible. The idea was that if the client is local, it could use some API functions to pass data to the DB engine, thus not using (i.e. parsing) SQL and the values in it. The closest solution was proposed by bobwienholt (using a prepared statement and binding parameters), but LOAD DATA .. INFILE turned out to be a bit faster in my case.
The best way to insert data in MS SQL without INSERT or UPDATE queries is through the Management Studio interface. Right-click the table name and select "Edit Top 200 Rows". You can then add data to the table directly by typing into each cell. To search, or to use SELECT or other SQL commands, right-click any of the 200 rows you have selected, go to Pane, then select SQL, and you can add an SQL command. Check it out. :D
Without using an INSERT statement, you can use "SQLite Studio" for inserting data. It's free and open source, so you can download it and check it out.

Inserting data including slashes into MySQL

I have some data containing slashes.
I am doing a CSV import with over 2000 lines. I loop through each one and it goes fine, but when there are slashes in the text, the whole string won't import. I don't get any errors; the data just doesn't show up in the database.
A sample value would be "Barr/Massive".
How can I make it so MySQL doesn't strip the slash?
Use bind parameters, don't insert the data directly into your SQL.
INSERT INTO table VALUES ($name)
versus
INSERT INTO table VALUES (?)
I'm assuming you're using some kind of PHP or Perl script for this. Otherwise, phpMyAdmin has a CSV import.
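For instance, a minimal PDO sketch of the placeholder version (connection details and the single-column table are assumptions):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO mytable (name) VALUES (?)');
foreach ($csvRows as $row) {
    // "Barr/Massive" arrives intact, slashes and all; no escaping needed.
    $stmt->execute(array($row[0]));
}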

How to load data into a MySQL table without spaces?

I have a text file to be imported into a MySQL table. The columns of the file are comma-delimited. I set up an appropriate table and used the command:
load data LOCAL INFILE 'myfile.txt' into table mytable FIELDS TERMINATED BY ',';
The problem is, there are several spaces in the text file, before and after the data in each column, and it seems that the spaces are all imported into the table (and that is not what I want). Is there a way to load the file without the extra spaces (other than processing each row of the text file before importing into MySQL)?
As far as I understand, there's no way to do this dynamically during the actual load of the data file (I've looked, as well).
It seems the best way to handle this is to either use the SET clause with the TRIM() function ("SET column2 = TRIM(column2)") or run an UPDATE on the string columns after loading, using TRIM().
You can also create a stored procedure that uses prepared statements to run TRIM() on all columns of a specified table immediately after loading it.
You would essentially pass in the table name as a parameter, and the procedure would use the information_schema database to determine which columns to update.
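For reference, a sketch of the SET-clause form mentioned above. MySQL needs the raw fields captured into user variables before TRIM() can be applied, so assuming a two-column table it would look like:

load data LOCAL INFILE 'myfile.txt' into table mytable
FIELDS TERMINATED BY ','
(@raw1, @raw2)
SET column1 = TRIM(@raw1), column2 = TRIM(@raw2);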
If you can use .NET, CsvReader is a great option (http://www.codeproject.com/KB/database/CsvReader.aspx). You can read data from a CSV and specify the delimiter, trimming options, etc. In your case, you could choose to trim left and right spaces from each value. You can then either save the result to a new text file and import it into the database, or loop through the CsvReader object and insert each row into the database directly. The performance of CsvReader is impressive. Hope this helps.