Storing complex Perl data structures in MySQL

I work on a large Perl website that currently stores all of its configuration in a Perl module. I need to move these settings into MySQL. The problem is that the settings are defined in lots of variables, and most of them are complex structures (hashes of hashes, arrays of hashes, and so on).
My first idea was to use XML, YAML, or the Storable Perl module to easily write and read the variables from a simple file, but my boss doesn't want any of these solutions. He wants the settings stored in MySQL, so other approaches are not an option.
My question is: does anybody know of any CPAN module(s) that would help with this task? What I basically need is a way to map all the complex Perl structures I have to MySQL tables.
Any suggestion would be really appreciated. Thanks!

Option 1: Store the data in serialized form (Data::Dumper, Storable, JSON, etc.) in a MySQL TEXT/MEDIUMTEXT/LONGTEXT column (64KB/16MB/4GB maximum sizes, respectively).
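For example, a minimal sketch of option #1 using JSON and DBI (the config table, its columns, and the connection details are all made up for illustration):

    use strict;
    use warnings;
    use DBI;
    use JSON::PP qw(encode_json decode_json);

    # Hypothetical table: CREATE TABLE config (name VARCHAR(64) PRIMARY KEY, value MEDIUMTEXT)
    my $dbh = DBI->connect('dbi:mysql:myapp', 'user', 'password', { RaiseError => 1 });

    my %settings = (
        timeouts => { connect => 5, read => 30 },
        mirrors  => [ { host => 'a.example.com' }, { host => 'b.example.com' } ],
    );

    # Write: serialize the whole structure into a single TEXT field
    $dbh->do('REPLACE INTO config (name, value) VALUES (?, ?)',
             undef, 'site_settings', encode_json(\%settings));

    # Read: fetch the serialized value and turn it back into a Perl structure
    my ($json) = $dbh->selectrow_array(
        'SELECT value FROM config WHERE name = ?', undef, 'site_settings');
    my $restored = decode_json($json);
    print $restored->{timeouts}{read}, "\n";   # prints 30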
Option 2: Use an ORM (Object-Relational Mapper) such as DBIx::Class, which maps Perl data to DB tables automatically (similar to Java's Hibernate). As far as I'm aware, you'll need to convert your data structures to objects, though there may be a DBIx::Class component that can deal with non-blessed data structures.
Frankly, unless you need to manipulate the config data within MySQL piece by piece, option #1 is dramatically simpler. However, if your boss's goal is to be able to query configuration details or manipulate individual elements one by one, you will have to go with option #2.
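If option #2 is the route, a minimal DBIx::Class sketch might look like the following (the settings table and its columns are assumptions for illustration):

    package MyApp::Schema::Result::Setting;
    use strict;
    use warnings;
    use base 'DBIx::Class::Core';

    __PACKAGE__->table('settings');
    __PACKAGE__->add_columns(
        id    => { data_type => 'integer', is_auto_increment => 1 },
        name  => { data_type => 'varchar', size => 64 },
        value => { data_type => 'text' },
    );
    __PACKAGE__->set_primary_key('id');
    __PACKAGE__->add_unique_constraint(['name']);

    package MyApp::Schema;
    use base 'DBIx::Class::Schema';
    __PACKAGE__->register_class(Setting => 'MyApp::Schema::Result::Setting');

    package main;
    my $schema = MyApp::Schema->connect('dbi:mysql:myapp', 'user', 'password');

    # Each setting becomes a row object instead of an entry in a Perl hash
    $schema->resultset('Setting')->create({ name => 'max_workers', value => 16 });
    my $row = $schema->resultset('Setting')->find({ name => 'max_workers' });
    print $row->value, "\n";

Note this stores each setting as a flat row; nested hashes and arrays would still need either their own tables and relationships, or a serialized column.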

Why don't you want to use Storable's freeze/thaw with MySQL?
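That would be close to option #1 above, but with a binary payload. A sketch, again with a hypothetical config table, whose value column should be a BLOB here because frozen data is binary:

    use strict;
    use warnings;
    use DBI;
    use Storable qw(nfreeze thaw);

    my $dbh = DBI->connect('dbi:mysql:myapp', 'user', 'password', { RaiseError => 1 });

    my %settings = ( retries => 3, hosts => [ 'db1', 'db2' ] );

    # nfreeze produces architecture-neutral binary data; store it in a BLOB column
    $dbh->do('REPLACE INTO config (name, value) VALUES (?, ?)',
             undef, 'site_settings', nfreeze(\%settings));

    my ($blob) = $dbh->selectrow_array(
        'SELECT value FROM config WHERE name = ?', undef, 'site_settings');
    my $restored = thaw($blob);
    print "@{ $restored->{hosts} }\n";   # prints: db1 db2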

Related

How to put a JSON object in a database automatically?

I have a very large JSON object that I want to put in a NoSQL database.
I would like to know:
First, how do I generate the database schema based on that JSON object?
Second, is there a way to put this object into the database automatically, without manually specifying which value (in the JSON object) goes in which column (in the database)?
I hope I was clear enough. Thanks!
Since you haven't specified which NoSQL database you're using in particular, I'll assume you're using MongoDB when I talk about things that are implementation-specific.
First off, you should know that NoSQL databases are by nature "schema-less". You could still implement your own schema (in your app, not the DB), but that's optional, and mostly done for validation purposes or to let future developers understand the planned structure of your data better. Read up on MongoDB's dynamic schemas to learn more; there are SO answers explaining how to define a schema in Mongoose, and the official Mongoose guide documents it as well.
Second, NoSQL databases don't work in terms of columns or rows. Rather, you need to think in terms of collections and documents. So to answer your question: yes, when you have a JSON object, you insert it directly (after first applying any formatting required by a schema, if you've implemented one as above). You don't enter data value by value (unless you've intentionally set it up to do so).
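As a concrete sketch using MongoDB's Perl driver (the database, collection, and sample document below are all made up):

    use strict;
    use warnings;
    use MongoDB;
    use JSON::PP qw(decode_json);

    # Any valid JSON document can be inserted as-is; no schema is needed up front
    my $doc = decode_json(
        '{"name":"widget","tags":["a","b"],"specs":{"w":10,"h":4}}');

    my $client     = MongoDB->connect('mongodb://localhost');
    my $collection = $client->ns('mydb.widgets');   # "database.collection"
    $collection->insert_one($doc);

The same idea holds in any driver/language: decode the JSON into a native structure and hand it to the collection's insert call.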
It sounds to me like you need to strengthen your fundamental understanding of how NoSQL works, as you seem to be mixing in concepts that belong to other DBMSes. There are plenty of introductory slideshows and articles that will get you started.
After that, consider installing MongoDB or something similar and just playing around with the command-line interface to get a good feel for it.

Dynamic JSON file vs API

I am designing a system with 30,000 objects or so and can't decide between two approaches: either have a JSON file precomputed for each object and get the data by pointing to the URL of the file (I think Twitter does something similar), or have a PHP/Perl/whatever script that produces the JSON object on the fly when requested, say from a database, and sends it back. Is one approach more suitable than the other? I guess if it takes a long time to generate the JSON data, it is better to have ready-made JSON files. But what if generating is as quick as accessing the database? Although I suppose one could have a dedicated table in the database specifically for this. The data doesn't change very often, so updating is not a constant concern; in that respect the data is static for all intents and purposes.
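For reference, the on-the-fly variant I have in mind would be roughly this (a sketch using Mojolicious::Lite and DBI; the table and columns are invented):

    use Mojolicious::Lite;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:objects', 'user', 'password', { RaiseError => 1 });

    # One route serves all 30,000 objects; the JSON is generated per request
    get '/objects/:id' => sub {
        my $c   = shift;
        my $row = $dbh->selectrow_hashref(
            'SELECT id, name, payload FROM objects WHERE id = ?',
            undef, $c->param('id'));
        return $c->render(json => { error => 'not found' }, status => 404) unless $row;
        $c->render(json => $row);
    };

    app->start;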
Anyway, any thoughts would be much appreciated!
Alex
You might want to try MongoDB, which returns objects as JSON and is highly scalable and easy to set up.

How to convert data stored in XML files into a relational database (MySQL)?

I have a few XML files containing data for a research project which I need to run some statistics on. The amount of data is close to 100GB.
The structure is not so complex (it could be mapped to perhaps 10 tables in a relational model), and given the nature of the problem, this data will never be updated again; I only need it available in a place where it's easy to run queries on it.
I've read about XML databases and the possibility of running XPath-style queries on them, but I've never used them and I'm not so comfortable with the idea. Having the data in a relational database would be my preferred choice.
So, I'm looking for a way to convert the data stored in XML into a relational database (think of a big .sql file similar to the one generated by mysqldump, but anything else would do).
The ultimate goal is to be able to run SQL queries for crunching the data.
After some research I'm almost convinced I have to write it on my own.
But I feel this is a common problem, and therefore there should be a tool which already does that.
So, do you know of any tool that would transform XML data into a relational database?
PS1:
My idea would be something like this (it could work differently; this is just to make sure you get my point):
Analyse the data structure (based on the XML files themselves, or on an XSD)
Build the relational database (tables, keys) based on that structure
Generate SQL statements to create the database
Generate SQL statements to fill in the data
PS2:
I've seen some posts here on SO, but I still couldn't find a solution.
Microsoft's "Xml Bulk Load" tool seems to do something in that direction, but I don't have an MS SQL Server.
Databases are not the only way to search data. I can highly recommend Apache Solr.
Strategies to implement search on XML files:
Keep your raw data as XML and search it using the Solr index.
Importing XML files of the right format into a MySQL database is easy:
https://dev.mysql.com/doc/refman/5.6/en/load-xml.html
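For instance, a sketch of driving that from Perl with DBI (the settings table and file path are made up; note that LOCAL INFILE must be enabled on both client and server):

    use strict;
    use warnings;
    use DBI;

    # LOAD XML understands rows shaped like
    #   <row><name>timeout</name><value>30</value></row>
    # or
    #   <row name="timeout" value="30"/>
    # where the tag/attribute names match the table's column names.
    my $dbh = DBI->connect('dbi:mysql:myapp;mysql_local_infile=1',
                           'user', 'password', { RaiseError => 1 });

    $dbh->do(q{
        LOAD XML LOCAL INFILE '/tmp/settings.xml'
        INTO TABLE settings
        ROWS IDENTIFIED BY '<row>'
    });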
This means you typically have to transform your XML data into that kind of format. How you do this depends on the complexity of the transformation, which programming languages you know, and whether you want to use XSLT (which is most probably a good idea).
From your previous posts it seems you know Python, so http://xmlsoft.org/XSLT/python.html may be the right thing for you to start with.
Take a look at StAX instead of XSD for analyzing/extracting the data. It's stream-based and can deal with huge XML files.
If you feel comfortable with Perl, I've had pretty good luck with the XML::Twig module for processing really big XML files.
Basically, all you need to do is set up a few twig handlers and import your data into MySQL using DBI/DBD::mysql.
There is a pretty good example on xmltwig.org.
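In outline, such a handler might look like this (a sketch; the element names, table, and columns are invented):

    use strict;
    use warnings;
    use XML::Twig;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:research', 'user', 'password', { RaiseError => 1 });
    my $sth = $dbh->prepare('INSERT INTO measurements (subject, value) VALUES (?, ?)');

    my $twig = XML::Twig->new(
        twig_handlers => {
            # called once per <record> element as soon as it is fully parsed
            record => sub {
                my ($t, $elt) = @_;
                $sth->execute($elt->first_child_text('subject'),
                              $elt->first_child_text('value'));
                $t->purge;   # free parsed chunks to keep memory flat on ~100GB input
            },
        },
    );
    $twig->parsefile('data.xml');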
If you are comfortable with commercial products, you might want to have a look at Data Wizard for MySQL by the SQL Maestro Group.
This application is targeted especially at exporting and, of course, importing data from/to MySQL databases, and this includes XML import. You can download a 30-day trial to check whether it is what you are looking for.
I have to admit that I have not used their MySQL product line yet, but I had a good user experience with their Firebird Maestro and SQLite Maestro products.

Can MySQL load from XML directly?

I am aware of the batch LOAD XML technique, e.g. Load XML Update Table--MySQL.
Can MySQL insert/replace rows directly from XML? I'd like to pass an XML string to MySQL.
Something like REPLACE INTO user XML VALUES ..., maybe even using AS to map the tags to the column names?
The primary thing is that I don't want to parse the XML in my code; I'd like MySQL to handle this. I don't have a file, I have the XML as a string.
I have looked and found there are some XML Functions:
12.11. XML Functions
The XML functions can do XPath, but I think this is a little fiddly: I have a 1:1 mapping from the XML to the table structure, so I'd just like to be able to say "hey MySQL, insert the values in this XML string into the table".
Is this possible?
In a nutshell, No.
What you're looking for is an XML storage engine for MySQL. One has never been created officially, and I have never seen a third-party one either (but feel free to Google).
If you really want to achieve this, then the closest you would get is to look for an alternative (R)DBMS, but that might not support the type of queries you wish to perform, may involve a bit of a learning curve, would no doubt require a server with superuser access, and would potentially mean refactoring a lot of your code.
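That said, for a small, fixed set of columns, the fiddly ExtractValue route mentioned in the question does work. A hedged sketch via Perl/DBI (the user table and tag names are assumptions):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:myapp', 'user', 'password', { RaiseError => 1 });

    # The XML arrives as a string; MySQL's ExtractValue() pulls fields out via XPath
    my $xml = '<user><name>alice</name><email>alice@example.com</email></user>';

    $dbh->do(q{
        REPLACE INTO user (name, email)
        SELECT ExtractValue(?, '/user/name'),
               ExtractValue(?, '/user/email')
    }, undef, $xml, $xml);

The drawback is exactly the fiddliness the question anticipates: every column has to be listed explicitly, and nothing maps tags to columns automatically.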

Create some tool for converting data from one database to another

This is kind of an implementation question, maybe. I wonder, if I were to make a tool to convert some relational database to some other kind of database, what would the approach be?
Say, for example, I want to convert the data and structure from a MySQL database to MSSQL. Would I need to use regular expressions to parse the SQL file? Or maybe I could convert it to XML or JSON first and load my target database from that structure?
Using existing tools for converting MySQL to MSSQL or anything similar is out of scope here, since I want to know how it is actually done.
Well, it's kind of a broad question, but generally speaking, having your own abstract representation of the structure and data would be a good thing, because you could extend your system "easily" by writing importers and exporters, and actually decouple your code a little by abstracting the relational DB concepts into your own format.
The importers would "reverse engineer" a given database by converting it to your own representation (as you say, XML/JSON, or even your own schema language, which would probably be better). Then the exporters would just convert from your format to the requested SQL dialect. No regular expressions, nothing "hardcoded".
This will allow you to extend your system to support a growing number of sources and targets, and also to handle errors such as a SQL feature of the "source" not being supported by the selected "target".
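To make that concrete, a minimal sketch of the abstract-representation idea in Perl (the structure and the per-dialect type map are invented for illustration):

    use strict;
    use warnings;

    # Importers would build structures like this from a source database;
    # exporters turn them into a target dialect.
    my $table = {
        name    => 'users',
        columns => [
            { name => 'id',   type => 'integer', primary => 1 },
            { name => 'name', type => 'string',  length  => 100 },
        ],
    };

    my %type_map = (
        mysql => { integer => 'INT', string => 'VARCHAR'  },
        mssql => { integer => 'INT', string => 'NVARCHAR' },
    );

    sub export_create_table {
        my ($table, $dialect) = @_;
        my @cols;
        for my $col (@{ $table->{columns} }) {
            my $type = $type_map{$dialect}{ $col->{type} };
            $type .= "($col->{length})" if $col->{length};
            push @cols, "$col->{name} $type" . ($col->{primary} ? ' PRIMARY KEY' : '');
        }
        return "CREATE TABLE $table->{name} (" . join(', ', @cols) . ');';
    }

    print export_create_table($table, 'mysql'), "\n";
    print export_create_table($table, 'mssql'), "\n";

Each new target then only needs its own type map and dialect rules, rather than a parser for every other system's dump format.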
My 2 cents, hope it helps!