How to store Buffer in MySQL with Node.js?
One way I know of is to convert the Buffer to a hex string and save it in a CHAR column in MySQL. But is it best practice to transform the data like this before saving and again after reading it back?
Is there a way that can directly save and get the Buffer (bytes array) in MySQL with Node.js, for example, using BLOB in MySQL?
Or does it not really matter which approach I use, because they don't differ much in practice?
I am not aware of a best practice. But I can share what works for me.
If I'm working with objects, I prefer gzipping them and storing the result as base64 (to cut down on character length compared to base16/hex) in a LONGTEXT column.
I think it's more helpful to store them as strings, because if you store them as raw buffers I don't know of a way to copy them out of the database (which is really handy when troubleshooting) and translate them back into your object - at least, I haven't found one.
For example (assuming Node.js 6+ for the Buffer.from() syntax; otherwise use new Buffer()):
const zlib = require('zlib');
const obj = {};
// gzip the JSON, then base64-encode the compressed Buffer for storage
const zip = zlib.gzipSync(JSON.stringify(obj)).toString('base64');
And, undoing:
// base64-decode back to a Buffer, decompress, then parse the JSON
const originalObj = JSON.parse(zlib.unzipSync(Buffer.from(zip, 'base64')).toString());
Since LONGTEXT holds up to ~4 GiB, you shouldn't ever hit the limit unless you have massive objects - and in that case I suppose you have an efficiency problem elsewhere.
Note: there is obviously a lot packed into each of those one-liners (gzip, JSON parsing, Buffer conversion, etc.), and any of it can fail, so make sure you handle errors appropriately.
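For the direct-BLOB route the question asks about, a parameterized query will pass a Node.js Buffer through to a BLOB column without any manual conversion. A minimal sketch, assuming the mysql2 driver and a hypothetical table docs(id INT PRIMARY KEY, payload BLOB):

const mysql = require('mysql2/promise');

async function roundTrip(buf) {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'test' });

  // A parameterized query sends the Buffer to the BLOB column as-is.
  await conn.execute('INSERT INTO docs (id, payload) VALUES (?, ?)', [1, buf]);

  // BLOB columns come back as Node.js Buffers, no decoding needed.
  const [rows] = await conn.execute('SELECT payload FROM docs WHERE id = ?', [1]);
  await conn.end();
  return rows[0].payload; // a Buffer with the same bytes as buf
}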
Related
I have plenty of base64-encoded strings which I save in my MySQL DB. However, since base64-encoded strings are ~33% larger than the raw data, I would like to know how I can optimize my storage. Right now I store the strings in a LONGTEXT column, but would LONGBLOB (or something similar) be better? In short: how can I store this data (currently base64 encoded) more compactly, while still being able to get it back out as base64 when I select it?
Thanks
If you have MySQL 5.6.1 or later, FROM_BASE64() and TO_BASE64() are available.
Base64 is plain ASCII text. Storing base64 in a TEXT column is no problem. No escaping is necessary since (I think) there are no naughty characters. But storing the result of FROM_BASE64() needs a BLOB (of some size).
So...
INSERT INTO t (myblob) VALUES ( FROM_BASE64('UmljayBKYW1lcw==') );
SELECT TO_BASE64(myblob) FROM t;
The FROM_BASE64 will reconstruct the original (pre-base64) data.
Even better would be to compress instead:
INSERT INTO t (myblob) VALUES ( COMPRESS('UmljayBKYW1lcw==') );
SELECT UNCOMPRESS(myblob) FROM t;
That way, you will (probably) save more than the 33% if the original (pre-base64) data was compressible. If the original was a .jpg or some other already-compressed data, then you may as well use FROM_BASE64 instead of COMPRESS.
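If you are driving this from an application rather than the mysql client, the same round trip works through parameterized queries. A minimal Node.js sketch, assuming the mysql2 driver and the table t(myblob BLOB) from above:

const mysql = require('mysql2/promise');

async function storeAndFetch(base64Str) {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'test' });

  // Let MySQL decode the base64 and store only the raw bytes.
  await conn.execute('INSERT INTO t (myblob) VALUES (FROM_BASE64(?))', [base64Str]);

  // Re-encode on the way out so the application still sees base64.
  const [rows] = await conn.execute('SELECT TO_BASE64(myblob) AS b64 FROM t');
  await conn.end();
  return rows[0].b64;
}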
I'm using a boolean array in memory to store a collision map of this size: 16*16*14*16 = 57344 booleans. And I must store it in a JSON file for 50 different maps. I've been searching for the best way to store it in my JSON file:
Compress the whole JSON document in base64 or some other encoding
Try to compress the collisionMap (which is the big part) in a more readable way
In my attempt at the second option (but maybe I'm reinventing the wheel) I've made this example:
000111101111111111100000000001111000
0_3_4_1_11_10_4_3
The first number indicates that the string starts with a 0, and the numbers after it mean there are 3 zeros, then 4 ones, then 1 zero, and so on.
Maybe with this small example you can't see much of a problem, but with 57344 booleans the result can still be big if there is a lot of variation.
I don't know if there is a better way to store it. Any idea of a good solution?
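For reference, the run-length idea described above can be sketched in a few lines of JavaScript; the function names here are just illustrative:

// Encode a boolean array as the starting value (0 or 1) followed by run lengths.
function encodeRuns(bools) {
  const runs = [bools[0] ? 1 : 0];
  let count = 1;
  for (let i = 1; i < bools.length; i++) {
    if (bools[i] === bools[i - 1]) {
      count++;
    } else {
      runs.push(count);
      count = 1;
    }
  }
  runs.push(count);
  return runs.join('_'); // e.g. "0_3_4_1_11_10_4_3"
}

// Rebuild the boolean array from the encoded string.
function decodeRuns(encoded) {
  const parts = encoded.split('_').map(Number);
  let value = parts[0] === 1;
  const bools = [];
  for (const len of parts.slice(1)) {
    for (let i = 0; i < len; i++) bools.push(value);
    value = !value;
  }
  return bools;
}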
I have to generate QR codes with custom fields: id of the field + name of the field + values of the field.
How much data can I encode inside the QR code? I need to know how many fields/values I can insert.
Should I use XML or JSON or CSV? What is most generic and efficient?
XML / JSON will not qualify for a QR code's alphanumeric mode since it will include lower-case letters. You'll have to use byte mode. The max there is 2,953 bytes. But the practical limit is far less -- perhaps a few hundred characters.
It is far better to encode a hyperlink to data if you can.
As Terence says, no reader will do anything with XML/JSON except show it. You need a custom reader anyway to do something useful with that data. (Which suggests this is not a good use case for QR codes.) But if you're making your own reader, you can use gzip compression to make the payload much smaller. Your reader would know to unzip it.
You might get away with something workable but this is not a good approach in general.
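If you do go the custom-reader route, a rough way to check whether a gzipped JSON payload still fits in byte mode is a few lines of Node.js; the payload shape below is purely illustrative, and 2,953 bytes is the version 40, lowest-error-correction maximum:

const zlib = require('zlib');

const QR_BYTE_MODE_MAX = 2953; // version 40, error correction level L

// Illustrative payload: id, name, and values for each custom field.
const payload = { fields: [{ id: 1, name: 'color', values: ['red', 'blue'] }] };

const compressed = zlib.gzipSync(JSON.stringify(payload));
console.log(`compressed size: ${compressed.length} bytes`);
console.log(`fits in byte mode: ${compressed.length <= QR_BYTE_MODE_MAX}`);

// A custom reader would reverse this:
const restored = JSON.parse(zlib.gunzipSync(compressed).toString());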
The maximum number of alphanumeric characters you can have is 4,296, although this requires the lowest level of error correction and will be very hard to scan.
JSON is generally more efficient at data storage than XML.
However, you will need to write your own app to scan the code - I don't know of any which will process raw JSON or XML. All the scanners will show you the text, though.
I want to save a hash as a packed string in a DB. I have the pack part down OK, but I'm having a problem getting the hash back.
test hash
my $hash = {
    test_string   => 'apples,bananas,oranges',
    test_subhash  => { like => 'apples' },
    test_subarray => [ 'red', 'yellow', 'orange' ],
};
I thought maybe I could use JSON::XS, as in this example, to convert the hash to a JSON string, and then pack the JSON string...
Thoughts on this approach?
Storable is capable of storing Perl structures very precisely. If you need to remember that something is a weak reference, etc, you want Storable. Otherwise, I'd avoid it.
JSON (Cpanel::JSON::XS) and YAML are good choices.
You can have problems if you store something using one version of Storable and try to retrieve it using an earlier version. That means all machines that access the database must have the same version of Storable.
Cpanel::JSON::XS is faster than Storable.
A fast YAML module is probably faster than Storable.
JSON can't store objects, but YAML and Storable can.
JSON and YAML are human readable (well, for some humans).
JSON and YAML are easy to parse and generate in other languages.
Usage:
use Cpanel::JSON::XS qw( encode_json decode_json );

my $for_the_db = encode_json($hash);
my $hash = decode_json($from_the_db);
I don't know what you mean by "packing". The string produced by Cpanel::JSON::XS's encode_json can be stored as-is into a BLOB field, while the string produced by Cpanel::JSON::XS->new->encode can be stored as-is into a TEXT field.
You may want to give the Storable module a whirl.
It can:
store your hash(ref) as a string with freeze
thaw it out at the time of retrieval
There are a lot of different ways to store a data structure in a scalar and then "restore" it back to its original state. There are advantages and disadvantages to each.
Since you started with JSON, I'll show you an example using it.
use JSON;
my $hash = {
    test_string   => 'apples,bananas,oranges',
    test_subhash  => { like => 'apples' },
    test_subarray => [ 'red', 'yellow', 'orange' ],
};
my $stored = encode_json($hash);
my $restored = decode_json($stored);
Storable, as was already suggested, is also a good idea. But it can be rather quirky. It's great if you just want your own script/system to store and restore the data, but beyond that, it can be a pain in the butt. Even transferring data across different operating systems can cause problems. It was recommended that you use freeze, and for most local applications, that's the right call. If you decide to use Storable for sending data across multiple machines, look at using nfreeze instead.
That being said, there are a ton of encoding methods that can handle "storing" data structures. Look at YAML or XML.
I'm not quite sure what you mean by "convert the hash to a JSON string, and then packing the JSON string". What further "packing" is required? Or did you mean "storing"?
There are a number of alternative methods for storing hashes in a database.
As Zaid suggested, you can use Storable to freeze and thaw your hash. This is likely to be the fastest method (although you should benchmark with the data you're using if speed is critical). But Storable uses a binary format which is not human readable, which means that you will only be able to access this field using Perl.
As you suggested, you can store the hash as a JSON string. JSON has the advantage of being fairly human readable, and there are JSON libraries for most any language, making it easy to access your database field from something other than Perl.
You can also switch to a document-oriented database like CouchDB or MongoDB, but that's a much bigger step.
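To illustrate the cross-language point about JSON: the string that encode_json writes to the database parses straight back into an equivalent structure in, say, Node.js. The literal below just mirrors the example hash from the question:

// The JSON text as it might come back from the database column.
const fromDb = '{"test_string":"apples,bananas,oranges","test_subhash":{"like":"apples"},"test_subarray":["red","yellow","orange"]}';

const hash = JSON.parse(fromDb);
console.log(hash.test_subhash.like);       // "apples"
console.log(hash.test_subarray.join(',')); // "red,yellow,orange"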
I have a MySQL query that returns a result with a single column of integers. Is there any way to get the MySQL C API to transfer this as actual integers rather than as ASCII text? For that matter, is there a way to get MySQL to do any of the API stuff as something other than ASCII text? I'm thinking this would save a bit of time in sprintf/sscanf (or whatever else is used) as well as bandwidth.
You're probably out of luck, to be honest. Looking at the MySQL C API (http://dev.mysql.com/doc/refman/5.0/en/mysql-fetch-row.html, http://dev.mysql.com/doc/refman/5.0/en/c-api-datatypes.html, look at MYSQL_ROW) there doesn't seem to be a mechanism for returning data in its actual type... the joys of using structs I guess.
You could always implement a wrapper which checks the column's type (from the MYSQL_FIELD metadata, http://dev.mysql.com/doc/refman/5.0/en/c-api-datatypes.html) and returns a C union, but that's probably poor advice; don't do that.