Insert image into binary field MySQL, NodeJS - mysql

First of all, I understand that storing images in a DB is bad practice; however, I need to do it.
I have a table in MySQL (..., img BINARY, ..) and have some images on my machine.
How should I transform my input data so it can be stored in a BINARY field? And how do I decode it back into its original format again?
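A minimal sketch of one way to do this with the node mysql package (not an authoritative recipe): it assumes the column is a BLOB/LONGBLOB rather than BINARY(n), since BINARY is fixed-length and capped at 255 bytes, and that the table has an AUTO_INCREMENT id; the images table and img column names here are invented.

const fs = require('fs');
const mysql = require('mysql');

const conn = mysql.createConnection({
  host: 'localhost', user: 'appuser', password: 'secret', database: 'mydb'
});

// Read the file into a Buffer; the driver escapes Buffers as hex literals,
// so the bytes land in the BLOB column unchanged.
const bytes = fs.readFileSync('photo.jpg');

conn.query('INSERT INTO images (img) VALUES (?)', [bytes], (err, result) => {
  if (err) throw err;

  // Decoding is the reverse: BLOB columns come back as Buffers, so writing
  // the Buffer to disk restores the original file byte for byte.
  conn.query('SELECT img FROM images WHERE id = ?', [result.insertId], (err2, rows) => {
    if (err2) throw err2;
    fs.writeFileSync('photo-copy.jpg', rows[0].img);
    conn.end();
  });
});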

Related

VARCHAR(MAX) vs TEXT vs .txt file for use in MySQL database

I tried to Google this, but the results I found were all related to importing data from a txt file to populate the database, as opposed to storing data.
To me, it seems strange that the contents of a file should be stored in a database. We're working on building an eCommerce site, and each item has a description. I assumed the standard would be to store the description in a txt file and the URL in the database, and not to store the huge contents in the database to keep the file size low and speeds high. When you need to store images in a database, you reference it using a URL instead of storing all the pixel data - why would text be any different?
That's what I thought, but everyone seems to be arguing about VARCHAR vs TEXT, so what really is the best way to store text data up to 1000 characters or so?
Thanks!
Whether you store long text data or image data in a database or in external files has no right or wrong answer. There are pros and cons on both sides—despite many developers making unequivocal claims that you should store images outside the database.
Consider that you might want the text data to:
Allow changes to be rolled back.
Support transaction isolation.
Enforce SQL access privileges to the data.
Be searchable in SQL when you create a fulltext index.
Support the NOT NULL constraint, so your text is required to contain something.
Automatically be included when you create a database backup (and the version of the text is the correct version, assuring consistency with other data).
Automatically transfer the text to replica database instances.
For all of the above, you would need the text to be stored in the database. If the text is outside the database, those features won't work.
With respect to VARCHAR vs. TEXT, I can answer for MySQL (though you mentioned VARCHAR(MAX), so you might be referring to Microsoft SQL Server).
In MySQL, both VARCHAR and TEXT max out at 65,535 bytes. If you use a multibyte character set, the maximum number of characters is lower.
Both VARCHAR and TEXT have a character set and collation.
VARCHAR allows a DEFAULT, but TEXT does not.
Internally, in the InnoDB storage engine, VARCHAR and TEXT are stored identically (as are VARBINARY, BLOB, and all their cousins). See https://www.percona.com/blog/2010/02/09/blob-storage-in-innodb/
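A small sketch tying the above together, assuming the roughly 1000-character e-commerce description from the question and the node mysql package; the items table and its column names are invented for illustration. VARCHAR(1000) fits well under the 65,535-byte limit, can be NOT NULL with a DEFAULT, and can carry a FULLTEXT index so the description stays searchable in SQL.

const mysql = require('mysql');

const conn = mysql.createConnection({
  host: 'localhost', user: 'appuser', password: 'secret', database: 'shop'
});

// InnoDB fulltext indexes require MySQL 5.6 or later.
conn.query(
  "CREATE TABLE IF NOT EXISTS items (" +
  "  id INT PRIMARY KEY AUTO_INCREMENT," +
  "  description VARCHAR(1000) NOT NULL DEFAULT ''," +   // a TEXT column could not take this DEFAULT
  "  FULLTEXT KEY ft_description (description)" +
  ") ENGINE=InnoDB",
  (err) => { if (err) throw err; }
);

// Because the text lives in the database, it is searchable, transactional,
// included in backups, and replicated along with everything else.
conn.query(
  "SELECT id FROM items WHERE MATCH(description) AGAINST (?)",
  ['waterproof hiking boots'],
  (err, rows) => {
    if (err) throw err;
    console.log(rows);
    conn.end();
  }
);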

Pictures using Postgres and Xojo

I have converted from a MySQL database to Postgres. During the conversion, the picture column in Postgres was created as bytea.
This Xojo code works in MySQL but not Postgres.
Dim mImage as Picture
mImage = rs.Field("Picture").PictureValue
Any ideas?
I don't know about this particular issue, but here's what you can do to find out yourself, perhaps:
Pictures are stored as BLOBs in the database. That means the column must also be declared as BLOB (or a similar binary type). If it was accidentally declared as TEXT, things would still work as long as the database never gets exported by other means, i.e. as long as only your Xojo code reads and writes the record through the PictureValue functions, which keep the data in BLOB form. But if you then convert to another database, the BLOB data would be read as text, and in that process it might get mangled.
So it may be relevant to let us know how you converted the DB. Did you perform an export as SQL commands and then import it into Postgres by running those commands? Do you still have the export file? If so, find a record with picture data in it and see if that data starts with x' followed by hex byte codes, e.g. x'45FE1200... and so on. If it doesn't, that's another indicator for my suspicion.
So, check the type of the Picture column in your old DB first. If that specifies a binary data type, then the above probably does not apply.
Next, you can look at the actual binary data that Xojo reads. To do that, get the BlobValue instead of the PictureValue and store it in a MemoryBlock. Do the same for a single picture with both the old and the new database. The MemoryBlocks should contain the same bytes. If not, that would suggest the data was not transferred correctly. Why? Well, that depends on how you converted it.

Saving LZW encoded data into a mySQL database

I send zipped data, compressed with LZW on the client side in JS, to the server. The problem is that the data becomes corrupted after saving it to the database with SQL.
So my question is: what collation should I use to store this data without it getting corrupted? (My current one is the latin1 default collation.)
I already checked whether the problem arises while transferring the data from client to server and vice versa, by sending the encoded data to the HTTP server and sending it back immediately (PHP echo) without processing it. I could decode the LZW properly, so it should definitely be a problem with the database.
More information about the schema: I only have a single table with 3 columns. "data" is of type BLOB (I also tried TEXT). user_id is INT and type is VARCHAR.
This is how I save the data:
INSERT INTO svg.saved_data (user_id, data, type) VALUES ('".$user_id."', '".$data."', '".$type."');
I don't know what your platform is, but the link below gives you a general recipe in PHP. It has an example that should do what you need.
https://stackoverflow.com/questions/17/binary-data-in-mysql
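For what it's worth, here is the same idea in Node.js terms (the question's server side is PHP, so treat this purely as an illustrative sketch; the connection settings are assumed, and the table/columns are the ones from the question). The two points that matter: the column stays a BLOB, which has no collation at all, and the compressed bytes are sent as a bound parameter instead of being concatenated into the SQL string.

const mysql = require('mysql');

const conn = mysql.createConnection({
  host: 'localhost', user: 'appuser', password: 'secret', database: 'svg'
});

// Stand-in for the LZW output received from the client, kept as raw bytes.
const compressed = Buffer.from('1f9d904c5a57', 'hex');

// Not: "INSERT ... VALUES ('" + data + "')" -- concatenation mangles binary input.
conn.query(
  'INSERT INTO saved_data (user_id, data, type) VALUES (?, ?, ?)',
  [42, compressed, 'lzw'],
  (err) => {
    if (err) throw err;
    conn.end();
  }
);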

How can I insert a file into a table in SQL?

I haven't seen any data type that can store a file in SQL. Is there something like that? What I'm specifically talking about is that I want to insert source code into my table. What is the best method to do it? It could either be stored in my database as nicely formatted text, or better (what I actually want), stored as a single file. Please note that I'm using MySQL.
It is best not to store a file in your SQL database, but to store a path to the file on the server (or any other UNC path) that your application can retrieve by itself and do with whatever is necessary.
see this: https://softwareengineering.stackexchange.com/questions/150669/is-it-a-bad-practice-to-store-large-files-10-mb-in-a-database
and this:
Better way to store large files in a MySQL database?
And if you still want to store the file in the DB, here is an example:
http://mirificampress.com/permalink/saving_a_file_into_mysql
If you can serialize the file, you can store it as binary and then deserialize it when needed:
http://dev.mysql.com/doc/refman/5.0/en/binary-varbinary.html
You can also use a BLOB (http://dev.mysql.com/doc/refman/5.0/en/blob.html), which has some differences. Normally I just store the file in the filesystem and a pointer in the DB, which makes serving it back via something like HTTP a bit easier and doesn't bloat the database.
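A sketch of that file-on-disk, pointer-in-the-DB pattern in Node.js (the upload directory, the files table, and its columns are invented for illustration):

const fs = require('fs');
const path = require('path');
const mysql = require('mysql');

const UPLOAD_DIR = '/var/app/uploads';   // assumed storage location, must already exist

const conn = mysql.createConnection({
  host: 'localhost', user: 'appuser', password: 'secret', database: 'mydb'
});

// Copy the bytes to disk, then record only the path (the "pointer") in MySQL.
function saveFile(originalName, bytes, callback) {
  const dest = path.join(UPLOAD_DIR, Date.now() + '-' + originalName);
  fs.writeFile(dest, bytes, (err) => {
    if (err) return callback(err);
    conn.query(
      'INSERT INTO files (name, path) VALUES (?, ?)',
      [originalName, dest],
      callback
    );
  });
}

saveFile('main.c', fs.readFileSync('main.c'), (err) => {
  if (err) throw err;
  conn.end();
});

Serving the file back later is just a filesystem read against the stored path, which keeps large payloads out of the database entirely.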
Storing the file in a table only makes sense if you need to do searches in that code. In other cases, you should only store a file's URL.
If you want to store a text file, use the TEXT datatype. Since it is source code, you might consider using the ASCII character set to save space, but be aware that this will cause character set conversions during your queries, which affects performance. Also, if it is ASCII you can use REGEXP for searches (that operator doesn't work with multi-byte charsets).
To load the file, if it is on the same server as MySQL, you can use the LOAD_FILE() function within an INSERT.
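And a hedged sketch of that server-side route: LOAD_FILE() reads a file that lives on the MySQL server itself, needs the FILE privilege, and is restricted by secure_file_priv. The snippets table and the path are invented for illustration; the client here is Node.js.

const mysql = require('mysql');

const conn = mysql.createConnection({
  host: 'localhost', user: 'appuser', password: 'secret', database: 'mydb'
});

// LOAD_FILE() runs on the database server: the path must exist there, be
// readable by mysqld, and fall under secure_file_priv, or NULL is stored.
conn.query(
  "INSERT INTO snippets (filename, body) " +
  "VALUES (?, LOAD_FILE('/var/lib/mysql-files/main.c'))",
  ['main.c'],
  (err) => {
    if (err) throw err;
    conn.end();
  }
);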

Importing a CSV file into mysql. (Specifically about create table command)

I have a text file full of values.
The first line is a list of column names, like this:
col_name_1, col_name_2, col_name_3 ......(600 columns)
and all the following lines have values like this:
1101,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1101,1,3.86,65,0.46418,65,0.57151...
What is the best way to import this into mysql?
Specifically, how do I come up with the proper CREATE TABLE command so that the data will load properly? What is the best generic data type that would accept all of the above values, like 1101 or 3.86 or 0.57151? I am not worried about the table being inefficient in terms of storage, as I need this for one-time use.
I have tried some of the suggestions in other related questions, like using phpMyAdmin (it crashes, I am guessing due to the large amount of data).
Please help!
Data in CSV files is not normalized; those 600 columns could probably be spread across several related tables, and that is the recommended way of treating such data. You can then use fgetcsv() to read the CSV file line by line in PHP.
To make MySQL process the CSV directly, you can create a 600-column table (I think) and issue a LOAD DATA LOCAL INFILE statement (or perhaps use mysqlimport, I am not sure about that).
The most generic data type would be VARCHAR, or TEXT for bigger values, but of course you lose the semantics of numbers, dates, etc. when everything is stored as text.
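A rough Node.js sketch of deriving the CREATE TABLE statement from the header row (the file and table names are invented; every column is declared TEXT, which is generic and sidesteps the 65,535-byte row-size limit that 600 wide VARCHAR columns would hit):

const fs = require('fs');

// Read the header line of the CSV and turn each column name into a TEXT
// column definition. Fine for a one-off; stream just the first line if the
// file is too big to read whole.
const header = fs.readFileSync('data.csv', 'utf8').split('\n')[0];
const columns = header.split(',').map((name) => '`' + name.trim() + '` TEXT');

console.log('CREATE TABLE import_dump (\n  ' + columns.join(',\n  ') + '\n);');

// With the table in place, the rows can be bulk-loaded, for example from the
// mysql command-line client started with --local-infile:
console.log(
  "LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE import_dump " +
  "FIELDS TERMINATED BY ',' IGNORE 1 LINES;"
);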
I noticed that you included the phpmyadmin tag.
PHPMyAdmin can handle this out of the box. It will decide "magically" which types to make each column, and will CREATE the table for you, as well as INSERT all the data. There is no need to worry about LOAD DATA INFILE, though that method can be safer if you want to know exactly what's going on without relying on PHPMyAdmin's magic tooling.
Try convertcsvtomysql: just upload your CSV file, and then you can download and/or copy the MySQL statements to create the table and insert the rows.