Mondrian - Fact Table Data as XML - MySQL

I am evaluating a Mondrian-Saiku solution for a client.
After analyzing their current database schemas, I realize that what constitutes their 'fact table data' is currently stored as XML. The XML documents themselves are stored as BLOBs in a MySQL table. Think of it like this: the table holds all the transactions of the company; the details of each transaction are stored in their own XML document; each XML string is stored as one of the field values in a given transaction row.
This presents a slight dilemma since the Mondrian XML schema requires the explicit use of column names.
Short of having to extract and transfer the XML data to new tables (not realistic for my purposes due to the size of the data and dependencies from other systems), is there any way I can work with my client's existing setup for the purposes of a Mondrian-Saiku implementation?

You need to expose the data in a traditional, tabular way. Since the database here is MySQL, can you create a database view which does some XML processing on the XML in the BLOB and exposes the columns?
Alternatively, maybe something like Composite or JBoss Teiid can help here. These tools allow you to expose virtually anything as a standard-looking table. It may not be quick enough, though!
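If the view route looks feasible, here is a rough sketch using MySQL's ExtractValue(); the table name, BLOB column, and XPath expressions below are guesses at your schema, not real ones:

CREATE OR REPLACE VIEW fact_transactions AS
SELECT
  t.id,
  ExtractValue(CONVERT(t.details_xml USING utf8), '/transaction/amount')      AS amount,       -- assumed XPath
  ExtractValue(CONVERT(t.details_xml USING utf8), '/transaction/customer_id') AS customer_id,  -- assumed XPath
  ExtractValue(CONVERT(t.details_xml USING utf8), '/transaction/date')        AS transaction_date
FROM transactions t;  -- assumed table name

Mondrian can then be pointed at the view as if it were a real fact table. The obvious caveat is that ExtractValue() runs per row and per column, so performance over a large transaction table needs to be tested.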

Related

Create a view in Snowflake dynamically using a JSON string

I need to create dynamic views from JSON string data
create or replace view schema.vw_tablename copy grants as
SELECT
v:Duration::int Duration,
v:Connectivity::string Connectivity
...
from public.tablename
This is a manual view for one of the tables, but I want to code it in a generic way, so that I can pass in the name of any table holding JSON data and the view will be created, with the output in tabular format.
If you want the view created inside Snowflake, driven by data (as compared to using a tool to create the views client side, which is what we do in our company), I think your only hope will be stored procedures. The detailed usage docs remind you that DDL operations commit the current transaction (which is always good to remember), but they also imply that you can run DDL, which is what you are asking for. That means you should be able to write some JavaScript that builds the CREATE VIEW command you want, based on the data handed to it.
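A minimal sketch of that idea, assuming (as in your snippet) that the JSON sits in a VARIANT column named v; the procedure name, the vw_ prefix, and the blanket ::string casts are just illustrative:

CREATE OR REPLACE PROCEDURE create_json_view(TABLE_NAME STRING)
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  // Collect the distinct top-level keys found in the VARIANT column "v" (assumed name)
  var rs = snowflake.execute({sqlText:
    "SELECT DISTINCT f.key FROM " + TABLE_NAME + ", LATERAL FLATTEN(input => v) f"});
  var cols = [];
  while (rs.next()) {
    var k = rs.getColumnValue(1);
    cols.push('v:"' + k + '"::string AS "' + k.toUpperCase() + '"');
  }
  // Build and run the CREATE VIEW statement (remember: DDL commits the current transaction)
  var ddl = "CREATE OR REPLACE VIEW vw_" + TABLE_NAME + " AS SELECT " +
            cols.join(", ") + " FROM " + TABLE_NAME;
  snowflake.execute({sqlText: ddl});
  return ddl;
$$;

-- usage (schema qualification omitted for brevity):
CALL create_json_view('TABLENAME');

Casting every key to a string keeps the sketch short; in practice you would map TYPEOF(f.value) to ::int, ::boolean, and so on, the way your manual view does with Duration.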
There is a nice two-part blog post that handles this requirement. Similar to what is mentioned in Simeon Pilgrim's answer, it also uses a stored procedure to generate the view, albeit one written in Snowflake SQL.
https://www.snowflake.com/blog/automating-snowflakes-semi-structured-json-data-handling/
https://www.snowflake.com/blog/automating-snowflakes-semi-structured-json-data-handling-part-2/

MySQL: Automate Data Ingestion from regular txt/csv files to a Database

Intro
I've searched all around about this problem, but I didn't really find a source of knowledge about it, so I'm sorry if this seems basic to you; for me it is rather intriguing, because I'm having a hard time guessing which keywords to use on Google to retrieve the proper info.
Problem Description:
As a matter of fact, I have two issues that I don't know how to deal with in a MySQL instance installed on a laptop in a Windows environment:
I have a DB in MySQL with 50 tables, of which 15 or 20 are tables with original data. The other tables are ones I generated from the original data tables, in order to create tables that would allow me to analyze the data in Power BI. The original data tables are fed by dumps from an ERP database.
My issue is the following:
How would one automate the process of receiving cumulative txt/csv files (via pen drive or any other transfer mechanism), storing those files in a folder, and then updating the existing tables with the new information? Is there any reference on best practices for dealing with such a scenario?
How can I keep my database in good shape through these successive data integrations? I mean, how can I make my database scalable and responsive?
Can you point me to some sources that would help me with this?
At the moment I imported the data into the tables in 2 steps:
1st - I created the table structure with the help of the Workbench import wizard (I had to do it this way because the tables have a lot of fields, literally dozens of them, and those fields need to be in the database). I also added primary keys and indexes to those tables;
2nd - I managed to load the data from the files into those tables using the LOAD DATA INFILE command.
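(For reference, roughly the shape of the command I used; the file path, table name, and delimiters here are just placeholders, not my real ones:)

LOAD DATA LOCAL INFILE '/path/to/new_dump.csv'   -- placeholder path
INTO TABLE original_data_table                   -- placeholder table name
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;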
Some of the fields of the tables created with the import wizard were given the data type TEXT, which is not necessary in this scenario. I would like to convert those fields to data type NVARCHAR(255) or something similar. However, there are a lot of fields to alter, across multiple tables, and I was wondering if I can write a query that generates all the ALTER TABLE statements I need.
So my issue here is: is it safe to alter the data type of multiple fields across multiple tables (in this case, changing fields with data type TEXT to NVARCHAR(255))? What is the best way to do this? Can you point me to some sources or best practices for this, please?
Thank you, in advance, for your help.
Cheers
You need a scripting language, not a UI. See the mysql command-line tool, the shell of your OS, etc.
1. DROP DATABASE and re-CREATE it.
2. LOAD DATA.
3. Massage the data to get the columns cleaner than what LOAD DATA provided.
4. Sic the BI tool on the data.
If you want to discuss Step 3, we need details about what transformations are needed between step 2 and step 4. That includes providing the format or schema for steps 2 and 4.
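As for the TEXT-to-NVARCHAR cleanup in your question: a query against information_schema can generate the ALTER TABLE statements for you. A minimal sketch, assuming the schema is called your_db and that every TEXT column really should become NVARCHAR(255):

SELECT CONCAT('ALTER TABLE `', TABLE_SCHEMA, '`.`', TABLE_NAME,
              '` MODIFY `', COLUMN_NAME, '` NVARCHAR(255)',
              IF(IS_NULLABLE = 'NO', ' NOT NULL', ''), ';') AS alter_stmt
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'your_db'   -- assumption: your schema name here
  AND DATA_TYPE = 'text';

Run the generated statements in the mysql client. The "is it safe" part is checking lengths first: if any existing value exceeds 255 characters, the ALTER will either error out or truncate, depending on sql_mode, so verify with MAX(CHAR_LENGTH(column)) before converting.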

What is the best way to store a pretty large JSON object in MySQL

I'm building a Laravel app whose core features are driven by rather large JSON objects (the largest ones are between 1000 and 1500 lines).
I know there are better database choices than MySQL for storing files and blocks of data, but for various reasons I will need to use MySQL for the application.
So my question is: how do I store my JSON objects most effectively in MySQL? I will not need to run any queries on the column that holds the data; there will be other columns for identifying it. Something like this:
id, title, created-at, updated-at, JSON-blobthingy
Any ideas?
You could use the JSON data type if you have MySQL version 5.7.8 or above.
You could store the JSON file on the server, and simply reference its location via MySQL.
You could also use one of the TEXT types.
The best answer I can give is to use MySQL 5.7. This version supports the new JSON column type, which handles large JSON documents very well (obviously).
https://dev.mysql.com/doc/refman/5.7/en/json.html
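For illustration, a minimal sketch of the layout you describe using the native JSON type (the table and column names are placeholders):

CREATE TABLE documents (
  id         INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  title      VARCHAR(255) NOT NULL,
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  payload    JSON NOT NULL                 -- MySQL validates the JSON on insert
);

INSERT INTO documents (title, payload)
VALUES ('example', '{"name": "test", "items": [1, 2, 3]}');

-- Even if you never filter on the JSON, individual values stay reachable:
SELECT title, JSON_EXTRACT(payload, '$.name') AS name FROM documents;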
You could compress the data before inserting it if you don't need it to be searchable. I'm using the 'zlib' library for that.
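If you would rather keep that compression inside MySQL than in the Laravel code, the built-in COMPRESS()/UNCOMPRESS() functions are also zlib-based. A small sketch, assuming the payload column is a BLOB type and with a made-up table name:

-- compressed bytes are binary, so the column should be a BLOB/LONGBLOB, not TEXT
INSERT INTO documents_compressed (title, payload_blob)
VALUES ('example', COMPRESS('{"name": "test", "items": [1, 2, 3]}'));

SELECT title, UNCOMPRESS(payload_blob) AS payload_json
FROM documents_compressed;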
Simply put, you can use the LONGBLOB type, which can hold up to 4GB of data, for the column holding the large JSON object. You can insert, update, and read this column normally, as if it were text or anything else!

Migrate new database with exceeding old database value

I need to migrate the extra database values into the new one. I have two databases, test and test new. I created both databases with the same data. I made all the changes in test; now I need to migrate those changes into test new without affecting the existing values.
If the table schemas are different, how would I then go about doing this? In my previous job, what I did was import the data (in my case, from Access) into my destination (MySQL), keeping the table structures, and then use SQL to select the data and manipulate it as required into the final destination tables.
In my case, where I don't have documentation for the old database and the columns were not named meaningfully (e.g. it uses 'field1', 'field2', etc.), I needed to trace through the application code to find out what the columns mean. Is there any better way? Also, sometimes columns contain multiple values as delimited data; is reading the code the only way?
It sounds like you know what to do, but are just not keen to do it.
If there is no documentation, then it makes sense that you will have to go to the code to figure out what it does. Regarding porting it across, you will most likely have to write custom scripts that pull the data, manipulate it, and insert it into the new table based on the new structure.
There are some tools to generate migration scripts, i.e. scripts that generate INSERTs for all your data. I think MySQL Workbench does this, but it most likely won't be sufficient since your tables have different structures.
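Purely as an illustration of such a custom script, here is a hedged sketch; the database names (test and test_new), the table and column names, the shared id, and the delimiter are all assumptions, since the real schemas are undocumented:

-- Map the badly named columns of the old schema onto meaningful columns in the new one,
-- skipping rows that were already migrated (assumes both tables share an id column)
INSERT INTO test_new.customers (id, customer_name, city, phone)
SELECT co.id, co.field1, co.field2, co.field3
FROM test.customers_old co
WHERE NOT EXISTS (SELECT 1 FROM test_new.customers c WHERE c.id = co.id);

-- A column holding delimited values ('red;green;blue') still needs string functions:
SELECT SUBSTRING_INDEX(field4, ';', 1) AS first_value
FROM test.customers_old;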

Convert database table structure to XSD format

Is there any way I can convert a table structure in a MySQL or Oracle database to XSD (XML Schema Definition) format?
Thank You.
Use XMLSpy.
http://williamjxj.wordpress.com/2011/05/25/1004/
Yes, but it's fairly complicated. You'll want to run the query SHOW CREATE TABLE <tablename> and it will return the full table creation statement (in tidy CREATE TABLE syntax).
Then you'll want to parse each line of the create table syntax using your language. Thankfully the fields are neatly separated by newlines.
The types should be fairly easy to map to XSD types.
Where it gets complicated is when you're parsing foreign key relationships - then you'll need to define custom types in your XSD and reference them accordingly.
It really comes down to your implementation. If you're looking for a portable data format that you can easily import/export from your database then there are a number of other solutions.
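If the database is MySQL, a hedged alternative to parsing the SHOW CREATE TABLE output by hand is to generate the per-column XSD elements from information_schema. A rough sketch with a deliberately small type map (the schema and table names are placeholders; extend the CASE as needed, and handle keys and relationships separately as described above):

SELECT CONCAT('<xs:element name="', COLUMN_NAME, '" type="',
              CASE
                WHEN DATA_TYPE IN ('int', 'bigint', 'smallint', 'tinyint') THEN 'xs:integer'
                WHEN DATA_TYPE IN ('decimal', 'double', 'float')           THEN 'xs:decimal'
                WHEN DATA_TYPE IN ('date')                                 THEN 'xs:date'
                WHEN DATA_TYPE IN ('datetime', 'timestamp')                THEN 'xs:dateTime'
                ELSE 'xs:string'
              END,
              '"', IF(IS_NULLABLE = 'YES', ' minOccurs="0"', ''), '/>') AS xsd_element
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'your_db'        -- assumption: schema name
  AND TABLE_NAME   = 'your_table'     -- assumption: table name
ORDER BY ORDINAL_POSITION;

You still have to wrap the generated elements in an xs:complexType / xs:sequence yourself; this only covers the column-level part of the mapping, not the foreign-key relationships.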