Inserting data including slashes into MySQL

I have some small data with slashes in it.
I am doing a CSV import with over 2000 lines. I am looping through each one and it goes fine, but when I have slashes in the text the whole string won't import. I don't get any errors; the data just doesn't show up in the database.
Sample text would be "Barr/Massive"
How can I make it so MySQL doesn't strip the slash?

Use bind parameters; don't interpolate the data directly into your SQL.
INSERT INTO table VALUES ($name)
versus
INSERT INTO table VALUES (?)
I'm assuming you're using some kind of PHP or Perl script for this. Otherwise, phpMyAdmin has a CSV import.
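For example, the same idea can be sketched directly in MySQL with a server-side prepared statement (the table and column names below are made up for illustration):
-- hypothetical table/column; the slash needs no escaping when bound
PREPARE ins FROM 'INSERT INTO mytable (name) VALUES (?)';
SET @name = 'Barr/Massive';
EXECUTE ins USING @name;
DEALLOCATE PREPARE ins;
The same pattern applies from PHP (PDO or mysqli) or Perl (DBI): prepare the statement once with a ? placeholder, then execute it with each CSV row's values bound to it; the slash is passed through as-is.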

Related

Convert PostgreSQL bytea column to MySQL blob

I am migrating a database from PostgreSQL to MySQL.
We were saving files in the database as PostgreSQL bytea columns. I wrote a script to export the bytea data and then insert it into a new MySQL database as a blob. The data inserts into MySQL fine, but it is not working at the application level. However, the application should not care, as the data is exactly the same. I am not sure what is wrong, but I suspect it is some difference between MySQL and PostgreSQL. Any help would be greatly appreciated.
This could really be a number of issues, but I can provide some tips on converting binary data between SQL vendors.
The first thing you need to be aware of is that each SQL database vendor uses different escape characters. I suspect that your binary data export is using hex, and you most likely have unwanted escape characters when you import into your new database.
I recently had to do this. The exported binary data was in hex, and vendor-specific escape characters were included.
In your new database, check whether the text value of the binary data starts with an 'x' or other unusual encoding. If it does, you need to get rid of it. Since you already have the data inserting properly, you can test by writing a SQL script to remove any unwanted vendor-specific escape characters from each imported binary record in your new database. Finally, you may need to UNHEX each new record.
So, something like this worked for me:
UPDATE my_example_table
SET my_blob_column = UNHEX(SUBSTRING(my_blob_column, 2, CHAR_LENGTH(my_blob_column)))
Note: The 2 in the SUBSTRING function is because the export script was using hex and prepending '\x' as a vendor-specific escape character.
I am not sure that will work for you, but it may be worth a try.
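If you want to verify the prefix is actually there before updating, a quick check along these lines might help (it reuses the example table and column names above, and the leading 'x' is an assumption based on what my export produced):
-- hypothetical sanity check: rows still carrying the leftover prefix
SELECT COUNT(*) AS still_prefixed
FROM my_example_table
WHERE LEFT(my_blob_column, 1) = 'x';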

WordPress $wpdb->prepare() puts slashes in the inserted values?

I just noticed that $wpdb->prepare() puts slashes in the inserted values, while it shouldn't do this! E.g. if you insert 'test' as a value, it ends up as \'test\' in its table field.
How can I reliably remove those slashes when retrieving data from DB?
It escapes some special characters before storing them in your database. You can use the stripslashes() function on your data after reading it to restore the original value.

Excel SQL Server Data Connection

Perhaps someone could provide some insight into a problem I have.
I have a SQL Server database which receives information every hour and is updated from a stored procedure using a Bulk Insert. This all works fine; however, the end result is to pull this information into Excel.
Establishing the data connection worked fine as well, until I attempted some calculations. The imported data is all formatted as text, and Excel's number formats weren't working, so I decided to look at the table in the database.
All the columns are set to varchar for the Bulk Insert to work so I changed a few to numeric. Refreshed in Excel and the calculations worked.
After repeated attempts I've not been able to get the Bulk Insert to work; even after generating a format file with bcp, it still returned errors on the insert ("could not convert varchar to numeric"). After some further searching I found it was only failing on one numeric column, which is generally empty.
Other than importing the data with VBA and converting it that way, or adding zero to every imported value so Excel converts it, I'm not sure what else to try.
Any suggestions are welcome.
Thanks!
Thanks for the replies. I had considered using =VALUE() in Excel but wanted to avoid the additional formulas.
I was eventually able to resolve my problem by generating a format file for the Bulk Insert using the bcp utility. Getting it to generate a file proved tricky enough; below is an example of how I generated it.
At an elevated cmd:
C:\>bcp databasename.dbo.tablename format nul -c -x -f outputformatfile.xml -t, -S localhost\SQLINSTANCE -T
This generated an XML format file for the specific table. As my table had two additional columns which weren't in the source data, I edited the XML and removed them. They were uniqueid and getdate columns.
Then I changed the Bulk Insert statement so it used the format file:
BULK INSERT [database].[dbo].[tablename]
FROM 'C:\bulkinsertdata.txt'
WITH (FORMATFILE='C:\outputformatfile.xml',FIRSTROW=3)
Using this method I was able to use the numeric and int datatypes successfully. Going back to Excel, when the data connection was refreshed it was able to determine the correct datatypes.
Hope that helps someone!

Job results to a txt file in SQL Server

I have set up a job within SSMS to run every day; it executes queries against two tables and stores the results in .csv text files in a folder. It works fine, but I would like the files to be comma-delimited text files, and it is currently storing them with spaces between the columns, which creates an unnecessary increase in size.
I went to Tools/Options etc. and set the SQL query text file format to comma delimited. That works fine when I run a manual query and choose to save the results, but not within the actual job. Any suggestions?
In fact, if I could get the results from both queries to store into one text file only, that would be even better.
I think you may want to use one of the following:
SSIS, sending the output to a file in CSV format
BCP
SQLCMD (a sample command is sketched below)
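For example, a minimal SQLCMD sketch (the server, database, and file paths are placeholders, not from the original post): -s sets the column separator, -W strips trailing spaces, and because both queries run in one call, both result sets land in the same output file.
At a cmd prompt:
C:\>sqlcmd -S localhost\SQLINSTANCE -d mydatabase -Q "SELECT * FROM table1; SELECT * FROM table2" -o C:\results.csv -s"," -W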

How to load data into a MySQL table without spaces?

I have a text file to be imported in a MySQL table. The columns of the files are comma delimited. I set up an appropriate table and I used the command:
load data LOCAL INFILE 'myfile.txt' into table mytable FIELDS TERMINATED BY ',';
The problem is, there are several spaces in the text file, before and after the data in each column, and it seems that the spaces are all imported into the table (and that is not what I want). Is there a way to load the file without the extra spaces (other than processing each row of the text file before importing into MySQL)?
As far as I understand, there's no way to do this during the actual load of the data file dynamically (I've looked, as well).
It seems the best way to handle this is either to use the SET clause with the TRIM() function during the load ("SET column2 = TRIM(column2)"), or to run an UPDATE on the string columns after loading, using the TRIM() function.
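As a sketch of the first option, the documented pattern is to read the raw field into a user variable and assign the trimmed value in the SET clause (the file, table, and column names here are placeholders):
LOAD DATA LOCAL INFILE 'myfile.txt' INTO TABLE mytable
FIELDS TERMINATED BY ','
(column1, @raw_column2)            -- raw value goes into a user variable
SET column2 = TRIM(@raw_column2);  -- trimmed value goes into the column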
You can also create a stored procedure using prepared statements to run the TRIM function on all columns in a specified table, immediately after loading it.
You would essentially pass in the table name as a variable, and the sp would use the information_schema database to determine which columns to update.
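A minimal sketch of such a procedure, assuming it should TRIM every char/varchar column of the given table (the procedure name and the type filter are my own choices, not from the original answer):
DELIMITER //
CREATE PROCEDURE trim_all_columns(IN p_table VARCHAR(64))
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE col_name VARCHAR(64);
  -- find the character columns of the target table
  DECLARE cur CURSOR FOR
    SELECT COLUMN_NAME
    FROM information_schema.COLUMNS
    WHERE TABLE_SCHEMA = DATABASE()
      AND TABLE_NAME = p_table
      AND DATA_TYPE IN ('char', 'varchar');
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
  OPEN cur;
  trim_loop: LOOP
    FETCH cur INTO col_name;
    IF done THEN
      LEAVE trim_loop;
    END IF;
    -- build and run one UPDATE per character column
    SET @sql = CONCAT('UPDATE `', p_table, '` SET `', col_name,
                      '` = TRIM(`', col_name, '`)');
    PREPARE stmt FROM @sql;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
  END LOOP;
  CLOSE cur;
END //
DELIMITER ;
You would call it right after the load, e.g. CALL trim_all_columns('mytable');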
If you can use .NET, CSVReader is a great option (http://www.codeproject.com/KB/database/CsvReader.aspx). You can read data from a CSV and specify the delimiter, trimming options, etc. In your case, you could choose to trim left and right spaces from each value. You can then either save the result to a new text file and import it into the database, or loop through the CsvReader object and insert each row into the database directly. The performance of CsvReader is impressive. Hope this helps.