Saving LZW-encoded data into a MySQL database - mysql

I send compressed data to the server; the compression is done client-side in JS with LZW. The problem is that the data becomes corrupted after saving it to the database with SQL.
So my question is: what collation should I use to accomplish that? (My current one is the latin1 default collation.)
I already checked whether the problem arises while transferring the data from client to server and vice versa by sending the encoded data to the HTTP server and echoing it back (PHP echo) immediately, without processing it. I could decode the LZW data properly, so it should definitely be a problem with the database.
More information about the schema: I only have a single table with 3 columns. "data" is of type BLOB (I also tried TEXT), user_id is INT and type is VARCHAR.
This is how I save the data:
INSERT INTO svg.saved_data (user_id, data, type) VALUES ('".$user_id."', '".$data."', '".$type."');

I don't know what your platform is, but the link below gives you a general recipe in PHP. It has an example which should do what you need.
https://stackoverflow.com/questions/17/binary-data-in-mysql
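As a sketch of that kind of recipe (shown here in Python rather than PHP; the connection details are hypothetical and the table is the svg.saved_data one from the question), the key point is to bind the compressed bytes as a query parameter instead of splicing them into the SQL string:

# Sketch: store LZW-compressed bytes in a BLOB column through a parameterized
# query, so the driver escapes the binary data rather than the SQL string mangling it.
# Assumes mysql-connector-python; credentials and values are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="svg"
)
cur = conn.cursor()

user_id = 42                      # hypothetical values for illustration
data = b"\x00\x1f\x9d..."         # the raw compressed payload from the client
row_type = "svg"

cur.execute(
    "INSERT INTO saved_data (user_id, data, type) VALUES (%s, %s, %s)",
    (user_id, data, row_type),
)
conn.commit()
cur.close()
conn.close()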

Related

Load JSON files in Google Cloud Storage into a BigQuery table

I am trying to do it with the client library using Python.
The problem I am facing is that the TIMESTAMP fields in the JSON files are in Unix epoch format, and BigQuery can't detect that.
According to the documentation:
So I wonder what to do?
I thought about changing the JSON format manually before I load it into the BigQuery table.
Or maybe there is an automatic conversion on the BigQuery side?
I wandered across the internet and could not find anything useful yet.
Thanks in advance for any support.
You have 2 solutions:
Either you update the format before the BigQuery integration
Or you update the format after the BigQuery integration
Before
Before means updating your JSON (manually or by script), or having it updated by the process that loads the JSON into BigQuery (such as Dataflow).
I personally don't like this; file handling is never fun or efficient.
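If you do go the "before" route anyway, a rough pre-processing sketch (not from the answer; it assumes newline-delimited JSON and a hypothetical epoch field named ts, in seconds) could look like this:

# Sketch: rewrite an NDJSON file, converting a hypothetical Unix-epoch field "ts"
# (seconds) into a "YYYY-MM-DD HH:MM:SS" string that BigQuery accepts as TIMESTAMP.
import json
from datetime import datetime, timezone

with open("input.json") as src, open("converted.json", "w") as dst:
    for line in src:
        record = json.loads(line)
        record["ts"] = datetime.fromtimestamp(
            record["ts"], tz=timezone.utc
        ).strftime("%Y-%m-%d %H:%M:%S")
        dst.write(json.dumps(record) + "\n")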
After
In this case, you let BigQuery load your JSON file into a temporary table, keeping your Unix timestamp as a number or a string. Then you run a query against this temporary table, convert the field into the correct timestamp format, and insert the data into the final table.
This way is smoother and easier (a simple SQL query to write). However, it implies a cost for reading all the loaded data (and then writing it again).
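As a sketch of that approach (not taken from the answer; it assumes the Python client library, newline-delimited JSON in GCS, and hypothetical dataset, table and field names, with the epoch stored as an integer):

# Sketch: load the raw JSON into a staging table, then convert the Unix-epoch
# column with TIMESTAMP_SECONDS() while inserting into the final table.
from google.cloud import bigquery

client = bigquery.Client()

load_job = client.load_table_from_uri(
    "gs://my-bucket/data.json",
    "my_project.my_dataset.staging_table",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # the epoch field comes in as INT64
    ),
)
load_job.result()  # wait for the load job to finish

convert_sql = """
INSERT INTO `my_project.my_dataset.final_table` (ts, payload)
SELECT TIMESTAMP_SECONDS(ts) AS ts, payload
FROM `my_project.my_dataset.staging_table`
"""
client.query(convert_sql).result()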

Insert image into binary field MySQL, NodeJS

First of all, I understand that storing images in a DB is bad practice; however, I need to do it.
I have a table in MySQL (..., img BINARY, ...) and have some images on my machine.
How should I transform my input data to make it storable in a BINARY field? And how do I decode it back into the original format again?
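As a minimal sketch of one common way to do this (shown in Python rather than NodeJS; the table, column and connection details are hypothetical, and a BLOB-style column is assumed rather than fixed-length BINARY): read the file as raw bytes, bind them as a query parameter, and write the fetched bytes back to a file to restore the image.

# Sketch: round-trip an image file through a binary column.
# Assumes mysql-connector-python; names and credentials are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="mydb"
)
cur = conn.cursor()

# Store: read the file as bytes and bind it as a parameter.
with open("cat.png", "rb") as f:
    cur.execute(
        "INSERT INTO images (name, img) VALUES (%s, %s)", ("cat.png", f.read())
    )
conn.commit()

# Restore: fetch the bytes and write them back to disk unchanged.
cur.execute("SELECT img FROM images WHERE name = %s", ("cat.png",))
with open("cat_copy.png", "wb") as f:
    f.write(cur.fetchone()[0])

cur.close()
conn.close()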

Decode JSON data while querying data from a MySQL database

Is it possible to retrieve decoded data from a MySQL DB column in which data is saved as JSON-encoded text? I.e., instead of fetching JSON-encoded text from the DB and decoding it separately, is there any method to fetch the decoded data from the SELECT query itself?
Any help?
Thanks in advance!
The answer linked by jyoti mishra is quite good; MySQL handles JSON data internally now and json_extract is definitely the preferred solution if you're on MySQL 5.7 or later.
However, if you're not, you can use the phpMyAdmin "Transformation" feature to modify the displayed output when you're viewing through phpMyAdmin.
To start, you have to have at least partially configured the "phpMyAdmin configuration storage". From the table's Structure tab, "Change" the column containing the JSON data, then look for the "Browser display transformation" dropdown and select JSON.
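Going back to the json_extract route mentioned above, a small sketch (assuming MySQL 5.7+, a hypothetical users table with a JSON column called settings, and mysql-connector-python on the client side) shows the SELECT returning already-decoded values:

# Sketch: let MySQL decode the JSON in the SELECT itself via JSON_EXTRACT
# (MySQL 5.7+). Table, column and JSON paths are hypothetical.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="mydb"
)
cur = conn.cursor()
cur.execute(
    """
    SELECT id,
           JSON_EXTRACT(settings, '$.theme') AS theme,
           JSON_UNQUOTE(JSON_EXTRACT(settings, '$.name')) AS name
    FROM users
    """
)
for row in cur.fetchall():
    print(row)  # the values arrive already pulled out of the JSON text
cur.close()
conn.close()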

Insert HTML data to SQL Server

I'd like to insert HTML data into SQL Server using ASP.NET and a stored procedure, but I'm not able to insert all the tags and double quotes. Any help would be appreciated.
You can use Base64 encoding when storing the data and decode it when reading the data back.
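To illustrate the idea (a language-agnostic sketch written in Python rather than ASP.NET), the HTML becomes a plain ASCII string before storage and is decoded on the way back out:

# Sketch: Base64 round-trip so the stored value contains no quotes or tags.
import base64

html = '<div class="note">He said "hi" & left.</div>'

# Encode before storing; the result is plain ASCII, safe to pass to the procedure.
stored = base64.b64encode(html.encode("utf-8")).decode("ascii")

# Decode after reading it back from the database.
restored = base64.b64decode(stored).decode("utf-8")
assert restored == html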
First try to strip down your problem to the basics:
SQL Server has the data type nvarchar(max). This type can store a lot of data: all characters that "arrive" at the database can be inserted. You should first check what data actually arrives at the server. An easy way is SQL Profiler; you should see in Profiler the data that "arrives" at the database. If things are already broken here (i.e. wrong encoding), work your way back up to your application.
How does the data get there? You only mention ASP.NET. You can connect to the database using an ODBC, JDBC, OLE DB or ADO driver (there might be more), and data can be converted along this path. You can set up a trace and see what happens. The easy way is normally the fastest: check the options of the driver / try other settings. Most drivers have only a small number of options. If you cannot find the problem here...
Your application. Strip it down to the minimal functionality: establish a connection to the database and insert some data into a table. Does this work? Does it work with a different collation? Does it work with Unicode data?
Maybe a more specific question could then be answered better.

Pictures using Postgres and Xojo

I have converted from a MySQL database to Postgres. During the conversion, the picture column in Postgres was created as bytea.
This Xojo code works in MySQL but not Postgres.
Dim mImage as Picture
mImage = rs.Field("Picture").PictureValue
Any ideas?
I don't know about this particular issue, but here's what you can do to find out yourself, perhaps:
Pictures are stored as BLOBs in the database. Now, this means that the column must also be declared as BLOB (or a similar binary type). If it was accidentally marked as TEXT, this would work as long as the database does not get exported by other means, i.e. as long as only your Xojo code reads and writes the record using the PictureValue functions, which keeps the data in BLOB form. But if you then convert to another database, the BLOB data would be read as text, and in that process it might get mangled.
So, it may be relevant to let us know how you converted the DB. Did you perform an export as SQL commands and then import it into Postgres by running those commands again? Do you still have the export file? If so, find a record with picture data in it and see if that data starts with x' followed by hex byte code, e.g. x'45FE1200... and so on. If it doesn't, that's another indicator for my suspicion.
So, check the type of the Picture column in your old DB first. If that specifies a binary data type, then the above probably does not apply.
Next, you can look at the actual binary data that Xojo reads. To do that, get the BlobValue instead of the PictureValue and store it in a MemoryBlock. Do the same for a single picture, both with the old and the new database. The MemoryBlocks should contain the same bytes. If not, that would suggest that the data was not transferred correctly. Why? Well, that depends on how you converted it.
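If you want to make that byte-for-byte comparison outside of Xojo as well, here is a rough sketch (table and column names are hypothetical; it assumes mysql-connector-python and psycopg2 are available) that pulls the raw picture bytes from both databases:

# Sketch: fetch the raw picture bytes for one record from the old MySQL database
# and the new Postgres database, then compare them byte for byte.
import mysql.connector
import psycopg2

my_conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="olddb"
)
my_cur = my_conn.cursor()
my_cur.execute("SELECT picture FROM photos WHERE id = %s", (1,))
mysql_bytes = bytes(my_cur.fetchone()[0])

pg_conn = psycopg2.connect("host=localhost dbname=newdb user=app password=secret")
pg_cur = pg_conn.cursor()
pg_cur.execute("SELECT picture FROM photos WHERE id = %s", (1,))
pg_bytes = bytes(pg_cur.fetchone()[0])  # bytea comes back as a memoryview

print(len(mysql_bytes), len(pg_bytes), mysql_bytes == pg_bytes)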